Apr 22 19:55:58.407066 ip-10-0-135-221 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 19:55:58.407078 ip-10-0-135-221 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 19:55:58.407085 ip-10-0-135-221 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 19:55:58.407362 ip-10-0-135-221 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 19:56:08.504749 ip-10-0-135-221 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 19:56:08.504764 ip-10-0-135-221 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 0cdab374f300423ea2ca76f7c9b8c2ac --
Apr 22 19:58:31.471165 ip-10-0-135-221 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:58:31.860652 ip-10-0-135-221 kubenswrapper[2583]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:58:31.860652 ip-10-0-135-221 kubenswrapper[2583]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:58:31.860652 ip-10-0-135-221 kubenswrapper[2583]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:58:31.860652 ip-10-0-135-221 kubenswrapper[2583]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:58:31.860652 ip-10-0-135-221 kubenswrapper[2583]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:58:31.862025 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.861939 2583 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:58:31.867901 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867877 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:58:31.867901 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867897 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:58:31.867901 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867901 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:58:31.867901 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867904 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:58:31.867901 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867907 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:58:31.867901 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867910 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867913 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867916 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867919 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867922 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867924 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867927 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867929 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867933 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867935 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867938 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867940 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867944 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867948 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867952 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867955 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867957 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867960 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867970 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:58:31.868128 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867973 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867976 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867978 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867981 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867983 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867986 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867988 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867991 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867994 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.867997 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868000 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868002 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868005 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868007 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868010 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868012 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868015 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868018 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868020 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868023 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:58:31.868586 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868025 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868027 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868030 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868033 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868035 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868038 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868041 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868043 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868046 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868048 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868051 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868053 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868056 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868058 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868060 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868063 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868065 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868067 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868070 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868073 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:58:31.869098 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868077 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868080 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868083 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868085 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868088 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868091 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868094 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868096 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868099 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868102 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868104 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868107 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868111 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868113 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868118 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868122 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868126 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868129 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868131 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868134 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:58:31.869607 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868137 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.868139 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869061 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869068 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869071 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869074 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869077 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869080 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869082 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869085 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869087 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869090 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869093 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869095 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869098 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869101 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869104 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869107 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869110 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869113 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:58:31.870116 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869116 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869119 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869121 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869124 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869127 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869129 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869132 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869135 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869138 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869140 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869143 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869147 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869150 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869154 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869156 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869159 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869162 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869164 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869168 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:58:31.870632 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869170 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869173 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869175 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869178 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869180 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869183 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869185 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869188 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869191 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869195 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869198 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869200 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869203 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869206 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869208 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869211 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869213 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869216 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869218 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869221 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:58:31.871146 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869224 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869226 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869228 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869231 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869233 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869236 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869239 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869241 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869244 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869246 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869248 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869251 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869253 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869256 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869258 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869261 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869264 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869267 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869269 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869272 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:58:31.871676 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869274 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869276 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869279 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869282 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869285 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869287 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869290 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869293 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.869295 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869382 2583 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869393 2583 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869404 2583 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869425 2583 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869432 2583 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869437 2583 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869444 2583 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869448 2583 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869452 2583 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869455 2583 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869458 2583 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869462 2583 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869465 2583 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:58:31.872181 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869468 2583 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869471 2583 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869474 2583 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869477 2583 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869479 2583 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869482 2583 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869486 2583 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869490 2583 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869493 2583 flags.go:64] FLAG: --config-dir=""
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869495 2583 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869499 2583 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869502 2583 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869505 2583 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869509 2583 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869512 2583 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869516 2583 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869519 2583 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869522 2583 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869525 2583 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869528 2583 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869533 2583 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869536 2583 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869539 2583 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869542 2583 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869545 2583 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:58:31.872717 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869548 2583 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869553 2583 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869556 2583 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869559 2583 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869562 2583 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869565 2583 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869569 2583 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869573 2583 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869576 2583 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869579 2583 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869582 2583 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869585 2583 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869588 2583 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869591 2583 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869594 2583 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869597 2583 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869600 2583 flags.go:64] FLAG: --feature-gates=""
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869603 2583 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869606 2583 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869610 2583 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869613 2583 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869616 2583 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869620 2583 flags.go:64] FLAG: --help="false"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869623 2583 flags.go:64] FLAG: --hostname-override="ip-10-0-135-221.ec2.internal"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869626 2583 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 19:58:31.873330 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869629 2583 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869632 2583 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869635 2583 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869639 2583 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869642 2583 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869644 2583 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869647 2583 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869650 2583 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869653 2583 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869656 2583 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869659 2583 flags.go:64] FLAG: --kube-reserved=""
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869662 2583 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869665 2583 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869668 2583 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869671 2583 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869674 2583 flags.go:64] FLAG: --lock-file=""
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869676 2583 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869679 2583 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869682 2583 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869688 2583 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869690 2583 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869693 2583 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869696 2583 flags.go:64] FLAG: --logging-format="text"
Apr 22 19:58:31.873952 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869699 2583 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869702 2583 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869705 2583 flags.go:64] FLAG: --manifest-url=""
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869707 2583 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869712 2583 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869715 2583 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869719 2583 flags.go:64] FLAG: --max-pods="110"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869722 2583 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869725 2583 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869728 2583 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869731 2583 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869734 2583 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869737 2583 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869740 2583 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869748 2583 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869751 2583 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869754 2583 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869758 2583 flags.go:64] FLAG: --pod-cidr=""
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869761 2583 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869766 2583 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869769 2583 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869772 2583 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869775 2583 flags.go:64] FLAG: --port="10250"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869778 2583 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 19:58:31.874510 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869781 2583 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06125d3d4e43fb06b"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869784 2583 flags.go:64] FLAG: --qos-reserved=""
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869787 2583 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869790 2583 flags.go:64] FLAG: --register-node="true"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869793 2583 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869795 2583 flags.go:64] FLAG: --register-with-taints=""
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869799 2583 flags.go:64] FLAG: --registry-burst="10"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869802 2583 flags.go:64] FLAG: --registry-qps="5"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869805 2583 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869807 2583 flags.go:64] FLAG: --reserved-memory=""
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869811 2583 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869814 2583 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869817 2583 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869819 2583 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869823 2583 flags.go:64] FLAG: --runonce="false"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869826 2583 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869828 2583 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869831 2583 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869834 2583 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869837 2583 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869840 2583 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869843 2583 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869846 2583 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869849 2583 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869851 2583 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869854 2583 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 19:58:31.875123 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869870 2583 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869874 2583 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869876 2583 flags.go:64] FLAG: --system-cgroups=""
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869879 2583 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869885 2583 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869890 2583 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869893 2583 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869897 2583 flags.go:64] FLAG: --tls-min-version=""
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869899 2583 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869902 2583 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869905 2583 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869908 2583 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869910 2583 flags.go:64] FLAG: --v="2"
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869915 2583 flags.go:64] FLAG: --version="false"
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869919 2583 flags.go:64] FLAG: --vmodule=""
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869923 2583 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.869926 2583 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870020 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870023 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870026 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870029 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870031 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870034 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:58:31.875743 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870037 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870040 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870042 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870045 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870048 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870050 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870053 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870055 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870057 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870060 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870063 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870066 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870068 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870071 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870074 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870077 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870080 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870082 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870084 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:58:31.876424 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870087 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870089 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870092 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870094 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870097 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870099 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870102 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870104 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870107 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870110 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870112 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870115 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870117 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870120 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870123 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870125 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870128 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870131 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870133 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870136 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:58:31.877330 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870138 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870140 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870143 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870145 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870148 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870150 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870153 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870157 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870159 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870162 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870165 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870167 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870169 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870172 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870175 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870179 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870181 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870184 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870186 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870189 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:58:31.878220 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870192 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870195 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870199 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870201 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870204 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870207 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870210 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870212 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870215 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870217 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870219 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870222 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870224 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870226 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870229 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870231 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870234 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870237 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870239 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870243 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:58:31.879106 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.870245 2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.870794 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true 
VolumeAttributesClass:false]} Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.878427 2583 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.878449 2583 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878521 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878529 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878534 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878540 2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878545 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878549 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878554 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878558 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878562 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878566 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878571 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878575 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:58:31.879847 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878579 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878583 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878590 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878597 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878601 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878605 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878609 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878614 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878619 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878624 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878628 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878632 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878636 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878642 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878646 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878650 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878654 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878658 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878662 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:58:31.880290 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878668 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878675 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878680 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878685 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878689 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878693 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878697 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878702 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878706 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878710 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878714 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878718 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878722 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878726 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878731 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878735 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878739 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878743 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878747 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:58:31.880813 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878751 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878755 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878760 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878764 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878768 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:58:31.881538 ip-10-0-135-221 
kubenswrapper[2583]: W0422 19:58:31.878773 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878777 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878781 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878786 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878791 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878795 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878799 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878803 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878808 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878812 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878816 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878820 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878824 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878828 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878833 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:58:31.881538 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878838 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:58:31.882055 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878842 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:58:31.882055 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878846 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:58:31.882055 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878850 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:58:31.882055 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878854 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:58:31.882055 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878882 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:58:31.882055 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878887 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:58:31.882055 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878891 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:58:31.882055 
ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878894 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:58:31.882055 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878899 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:58:31.882055 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878903 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:58:31.882055 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878908 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:58:31.882055 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878912 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:58:31.882055 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878917 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:58:31.882055 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878921 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:58:31.882055 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.878925 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:58:31.882475 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.878933 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:58:31.882475 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879096 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:58:31.882475 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879104 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:58:31.882475 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879108 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:58:31.882475 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879113 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:58:31.882475 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879118 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:58:31.882475 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879123 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:58:31.882475 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879127 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:58:31.882475 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879131 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:58:31.882475 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879136 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:58:31.882475 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879141 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:58:31.882475 ip-10-0-135-221 kubenswrapper[2583]: W0422 
19:58:31.879146 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:58:31.882475 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879150 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:58:31.882475 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879156 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:58:31.882475 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879163 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879167 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879171 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879176 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879179 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879184 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879188 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879193 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879197 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879201 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879204 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879209 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879213 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879217 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879220 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879225 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879229 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879233 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879237 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879241 
2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:58:31.882902 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879245 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879250 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879254 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879258 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879263 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879268 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879272 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879276 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879280 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879284 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879288 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879292 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879296 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879300 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879304 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879308 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879312 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879316 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879320 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879324 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:58:31.883410 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879328 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879332 2583 feature_gate.go:328] unrecognized feature gate: 
DyanmicServiceEndpointIBMCloud Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879336 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879340 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879344 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879348 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879352 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879358 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879363 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879368 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879373 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879378 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879382 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879385 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879389 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879393 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879397 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879407 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879412 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:58:31.884006 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879416 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:58:31.884838 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879420 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:58:31.884838 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879424 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:58:31.884838 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879427 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:58:31.884838 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879431 2583 feature_gate.go:328] unrecognized feature gate: 
MultiDiskSetup Apr 22 19:58:31.884838 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879435 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:58:31.884838 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879439 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:58:31.884838 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879443 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:58:31.884838 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879447 2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:58:31.884838 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879451 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:58:31.884838 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879455 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:58:31.884838 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879459 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:58:31.884838 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879463 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:58:31.884838 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:31.879467 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:58:31.884838 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.879475 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:58:31.884838 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.880282 2583 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 19:58:31.887203 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.887186 2583 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 19:58:31.888210 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.888196 2583 server.go:1019] "Starting client certificate rotation" Apr 22 19:58:31.888316 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.888300 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:58:31.888352 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.888335 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:58:31.910826 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.910807 2583 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:58:31.913697 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.913659 2583 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:58:31.926927 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.926905 2583 log.go:25] "Validated CRI v1 runtime API" Apr 22 19:58:31.932818 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.932801 2583 log.go:25] "Validated CRI v1 image API" Apr 22 
Apr 22 19:58:31.934079 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.934061 2583 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:58:31.936094 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.936077 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:58:31.936316 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.936296 2583 fs.go:135] Filesystem UUIDs: map[5609bfcf-c5e1-4b38-b918-64e972f120a8:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a7a643ad-6bb8-48fa-8203-c5ea04a798fd:/dev/nvme0n1p4]
Apr 22 19:58:31.936384 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.936318 2583 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 19:58:31.942278 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.942150 2583 manager.go:217] Machine: {Timestamp:2026-04-22 19:58:31.940418725 +0000 UTC m=+0.360325508 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097328 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21ff0a8e8542e511bd708486fef734 SystemUUID:ec21ff0a-8e85-42e5-11bd-708486fef734 BootID:0cdab374-f300-423e-a2ca-76f7c9b8c2ac Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:eb:f9:5f:61:59 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:eb:f9:5f:61:59 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:68:4a:d1:b5:e2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 19:58:31.942278 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.942260 2583 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 19:58:31.942442 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.942385 2583 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 19:58:31.944587 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.944562 2583 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 19:58:31.944772 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.944590 2583 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-221.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 19:58:31.944852 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.944786 2583 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 19:58:31.944852 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.944798 2583 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 19:58:31.944852 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.944822 2583 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 19:58:31.945527 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.945513 2583 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
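
The nodeConfig blob above fixes this node's resource accounting: SystemReserved of cpu 500m / memory 1Gi / ephemeral-storage 1Gi, a memory.available hard-eviction threshold of 100Mi, and KubeReserved null, against the Machine line's MemoryCapacity of 32812171264 bytes. A back-of-the-envelope check of the resulting allocatable memory using the standard Kubernetes rule (allocatable = capacity - kube-reserved - system-reserved - hard eviction threshold); a sketch of the arithmetic, not kubelet's exact code path:

    GiB = 1024 ** 3
    MiB = 1024 ** 2

    capacity = 32_812_171_264    # MemoryCapacity from the Machine: line, bytes
    system_reserved = 1 * GiB    # SystemReserved memory: "1Gi"
    kube_reserved = 0            # KubeReserved is null in nodeConfig
    eviction_hard = 100 * MiB    # memory.available hard threshold: "100Mi"

    allocatable = capacity - system_reserved - kube_reserved - eviction_hard
    print(allocatable, f"bytes = {allocatable / GiB:.2f} GiB")
    # -> 31633571840 bytes = 29.46 GiB
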
Apr 22 19:58:31.947096 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.947083 2583 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 19:58:31.947236 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.947224 2583 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 19:58:31.949270 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.949259 2583 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 19:58:31.949327 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.949282 2583 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 19:58:31.949327 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.949299 2583 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 19:58:31.949327 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.949312 2583 kubelet.go:397] "Adding apiserver pod source"
Apr 22 19:58:31.949327 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.949327 2583 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 19:58:31.950375 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.950360 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 19:58:31.950452 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.950387 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 19:58:31.953019 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.953001 2583 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 19:58:31.954615 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.954601 2583 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
[14 "Loaded volume plugin" records (plugins.go:616, I0422 19:58:31.956566-31.957556) elided; plugins, in order: kubernetes.io/portworx-volume, empty-dir, git-repo, host-path, nfs, secret, iscsi, downward-api, fc, configmap, projected, local-volume, csi, image.]
Apr 22 19:58:31.959836 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.959814 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gjcjh"
Apr 22 19:58:31.961575 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:31.961552 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-221.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 19:58:31.961575 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.961558 2583 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-221.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 19:58:31.961948 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:31.961919 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 19:58:31.962017 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.961965 2583 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 19:58:31.962075 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.962062 2583 server.go:1295] "Started kubelet"
Apr 22 19:58:31.962671 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.962645 2583 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 19:58:31.963144 ip-10-0-135-221 systemd[1]: Started Kubernetes Kubelet.
Apr 22 19:58:31.963789 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.963744 2583 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 19:58:31.963831 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.963823 2583 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 19:58:31.965048 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.965026 2583 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 19:58:31.965130 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.965073 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gjcjh"
Apr 22 19:58:31.965383 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.965363 2583 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 19:58:31.971831 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:31.970923 2583 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-221.ec2.internal.18a8c6243e2de7e8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-221.ec2.internal,UID:ip-10-0-135-221.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-221.ec2.internal,},FirstTimestamp:2026-04-22 19:58:31.961978856 +0000 UTC m=+0.381885646,LastTimestamp:2026-04-22 19:58:31.961978856 +0000 UTC m=+0.381885646,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-221.ec2.internal,}"
Apr 22 19:58:31.972826 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.972802 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 19:58:31.973370 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:31.973344 2583 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 19:58:31.973461 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.973408 2583 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 19:58:31.973982 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.973962 2583 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 19:58:31.974066 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.973963 2583 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 19:58:31.974066 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.974020 2583 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 19:58:31.974066 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.974046 2583 factory.go:55] Registering systemd factory
Apr 22 19:58:31.974066 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.974064 2583 factory.go:223] Registration of the systemd container factory successfully
Apr 22 19:58:31.974224 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.974112 2583 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 19:58:31.974224 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.974122 2583 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 19:58:31.974327 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:31.974245 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-221.ec2.internal\" not found"
Apr 22 19:58:31.974327 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.974318 2583 factory.go:153] Registering CRI-O factory
Apr 22 19:58:31.974327 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.974329 2583 factory.go:223] Registration of the crio container factory successfully
Apr 22 19:58:31.974460 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.974368 2583 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 19:58:31.974460 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.974391 2583 factory.go:103] Registering Raw factory
Apr 22 19:58:31.974460 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.974403 2583 manager.go:1196] Started watching for new ooms in manager
Apr 22 19:58:31.975542 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.975523 2583 manager.go:319] Starting recovery of all containers
Apr 22 19:58:31.981226 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.981195 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
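
The factory registrations above also show why this node runs CRI-O: the crio factory registers cleanly while the containerd factory fails because /run/containerd/containerd.sock does not exist. A trivial on-node check of which runtime sockets are present; note that only the containerd path appears in the log, while /var/run/crio/crio.sock is CRI-O's conventional default and an assumption here:

    import os

    # containerd path taken from the failed-registration line above;
    # the CRI-O path is the conventional default (assumed, not shown in this log).
    for sock in ("/var/run/crio/crio.sock", "/run/containerd/containerd.sock"):
        print(sock, "->", "present" if os.path.exists(sock) else "missing")
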
protocol="IPv6" Apr 22 19:58:31.983667 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.983606 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:31.986721 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:31.986696 2583 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-221.ec2.internal\" not found" node="ip-10-0-135-221.ec2.internal" Apr 22 19:58:31.986807 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.986739 2583 manager.go:324] Recovery completed Apr 22 19:58:31.990834 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.990822 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:31.993265 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.993251 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:31.993332 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.993283 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:31.993332 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.993293 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:31.994587 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.994569 2583 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:58:31.994587 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.994584 2583 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:58:31.994725 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.994606 2583 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:58:31.998017 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.998001 2583 policy_none.go:49] "None policy: Start" Apr 22 19:58:31.998101 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.998021 2583 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:58:31.998101 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:31.998034 2583 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:58:32.054019 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.040387 2583 manager.go:341] "Starting Device Plugin manager" Apr 22 19:58:32.054019 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.040415 2583 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:58:32.054019 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.040425 2583 server.go:85] "Starting device plugin registration server" Apr 22 19:58:32.054019 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.040647 2583 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:58:32.054019 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.040657 2583 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:58:32.054019 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.040779 2583 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:58:32.054019 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.040844 2583 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:58:32.054019 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.040854 
2583 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:58:32.054019 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.041372 2583 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:58:32.054019 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.041404 2583 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-221.ec2.internal\" not found" Apr 22 19:58:32.091285 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.091261 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:58:32.091285 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.091289 2583 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:58:32.091435 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.091304 2583 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 19:58:32.091435 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.091312 2583 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:58:32.091435 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.091339 2583 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:58:32.095066 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.095045 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:32.141446 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.141399 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:32.142304 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.142289 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:32.142387 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.142317 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:32.142387 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.142329 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:32.142387 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.142352 2583 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.150632 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.150617 2583 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.150711 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.150638 2583 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-221.ec2.internal\": node \"ip-10-0-135-221.ec2.internal\" not found" Apr 22 19:58:32.190176 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.190154 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-221.ec2.internal\" not found" Apr 22 19:58:32.192231 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.192213 2583 kubelet.go:2537] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-221.ec2.internal"] Apr 22 19:58:32.192286 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.192280 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:32.193193 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.193163 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:32.193269 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.193208 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:32.193269 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.193219 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:32.195545 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.195515 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:32.195679 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.195651 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.195726 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.195697 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:32.196245 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.196230 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:32.196245 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.196231 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:32.196349 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.196266 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:32.196349 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.196268 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:32.196349 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.196277 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:32.196349 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.196282 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:32.198481 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.198467 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.198527 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.198492 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:32.199182 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.199168 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:32.199251 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.199198 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:32.199251 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.199215 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:32.226393 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.226369 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-221.ec2.internal\" not found" node="ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.230815 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.230800 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-221.ec2.internal\" not found" node="ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.275457 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.275431 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a35607486dd92a2cad3be4c453bea29b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal\" (UID: \"a35607486dd92a2cad3be4c453bea29b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.275557 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.275462 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a35607486dd92a2cad3be4c453bea29b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal\" (UID: \"a35607486dd92a2cad3be4c453bea29b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.275557 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.275485 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3c3427913a40e2eb667593f4c197b1b3-config\") pod \"kube-apiserver-proxy-ip-10-0-135-221.ec2.internal\" (UID: \"3c3427913a40e2eb667593f4c197b1b3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.291027 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.291006 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-221.ec2.internal\" not found" Apr 22 19:58:32.376332 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.376301 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a35607486dd92a2cad3be4c453bea29b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal\" (UID: \"a35607486dd92a2cad3be4c453bea29b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal" Apr 
22 19:58:32.376445 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.376345 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a35607486dd92a2cad3be4c453bea29b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal\" (UID: \"a35607486dd92a2cad3be4c453bea29b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.376445 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.376393 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3c3427913a40e2eb667593f4c197b1b3-config\") pod \"kube-apiserver-proxy-ip-10-0-135-221.ec2.internal\" (UID: \"3c3427913a40e2eb667593f4c197b1b3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.376445 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.376396 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a35607486dd92a2cad3be4c453bea29b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal\" (UID: \"a35607486dd92a2cad3be4c453bea29b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.376445 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.376415 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a35607486dd92a2cad3be4c453bea29b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal\" (UID: \"a35607486dd92a2cad3be4c453bea29b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.376561 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.376450 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3c3427913a40e2eb667593f4c197b1b3-config\") pod \"kube-apiserver-proxy-ip-10-0-135-221.ec2.internal\" (UID: \"3c3427913a40e2eb667593f4c197b1b3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.391427 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.391402 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-221.ec2.internal\" not found" Apr 22 19:58:32.492288 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.492216 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-221.ec2.internal\" not found" Apr 22 19:58:32.528433 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.528401 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.534008 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.533990 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-221.ec2.internal" Apr 22 19:58:32.593240 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.593189 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-221.ec2.internal\" not found" Apr 22 19:58:32.693676 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.693646 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-221.ec2.internal\" not found" Apr 22 19:58:32.791984 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.791903 2583 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:32.794018 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.794000 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-221.ec2.internal\" not found" Apr 22 19:58:32.888540 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.888509 2583 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 19:58:32.889088 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.888633 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:58:32.889088 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.888643 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:58:32.889088 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.888660 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:58:32.894815 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.894795 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-221.ec2.internal\" not found" Apr 22 19:58:32.967880 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.967819 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:53:31 +0000 UTC" deadline="2027-09-27 02:23:33.462137687 +0000 UTC" Apr 22 19:58:32.967880 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.967876 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12534h25m0.49428206s" Apr 22 19:58:32.973161 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.973144 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 19:58:32.983964 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:32.983943 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:58:32.995623 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:32.995598 2583 kubelet_node_status.go:515] "Error getting the current 
node from lister" err="node \"ip-10-0-135-221.ec2.internal\" not found" Apr 22 19:58:33.003341 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.003324 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-prxcf" Apr 22 19:58:33.011347 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.011321 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-prxcf" Apr 22 19:58:33.034577 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.034555 2583 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:33.074667 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.074628 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-221.ec2.internal" Apr 22 19:58:33.086444 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.086422 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:58:33.087333 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.087320 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal" Apr 22 19:58:33.099066 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.099032 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:58:33.183421 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:33.183390 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda35607486dd92a2cad3be4c453bea29b.slice/crio-82551d6a632ea4245a7e07aa4e122b9d9c3770732008a49264d581fef57763f5 WatchSource:0}: Error finding container 82551d6a632ea4245a7e07aa4e122b9d9c3770732008a49264d581fef57763f5: Status 404 returned error can't find the container with id 82551d6a632ea4245a7e07aa4e122b9d9c3770732008a49264d581fef57763f5 Apr 22 19:58:33.183942 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:33.183921 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c3427913a40e2eb667593f4c197b1b3.slice/crio-858cac969abe130332b7f775405cf13fdd8d62bc5a3b4e6ccb2228acee84a58c WatchSource:0}: Error finding container 858cac969abe130332b7f775405cf13fdd8d62bc5a3b4e6ccb2228acee84a58c: Status 404 returned error can't find the container with id 858cac969abe130332b7f775405cf13fdd8d62bc5a3b4e6ccb2228acee84a58c Apr 22 19:58:33.188221 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.188207 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:58:33.934460 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.934380 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:33.951197 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.951170 2583 apiserver.go:52] "Watching apiserver" Apr 22 19:58:33.956894 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.956554 2583 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 19:58:33.959274 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.959239 2583 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-multus/multus-wvr6d","openshift-multus/network-metrics-daemon-q6lbk","openshift-network-diagnostics/network-check-target-rgxkv","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh","openshift-cluster-node-tuning-operator/tuned-lbzbf","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal","openshift-multus/multus-additional-cni-plugins-x9bbm","openshift-network-operator/iptables-alerter-2vqmz","openshift-ovn-kubernetes/ovnkube-node-ws7ww","kube-system/konnectivity-agent-zsbfm","kube-system/kube-apiserver-proxy-ip-10-0-135-221.ec2.internal","openshift-dns/node-resolver-w5slr","openshift-image-registry/node-ca-gvfnk"] Apr 22 19:58:33.962275 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.962252 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:33.964401 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.964373 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 19:58:33.964543 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.964520 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 19:58:33.964727 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.964703 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 19:58:33.964727 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.964719 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 19:58:33.964945 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.964777 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 19:58:33.964945 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.964724 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hpcg5\"" Apr 22 19:58:33.965398 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.965381 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 19:58:33.965479 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:33.965457 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128" Apr 22 19:58:33.969587 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.969571 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv" Apr 22 19:58:33.969668 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:33.969630 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2" Apr 22 19:58:33.972029 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.972009 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:33.972518 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.972054 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:33.973929 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.973910 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 19:58:33.974077 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.974066 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-snqz7\"" Apr 22 19:58:33.974172 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.974162 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:58:33.974343 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.974332 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 19:58:33.974494 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.974484 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 19:58:33.974632 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.974610 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 19:58:33.974729 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.974645 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fg666\"" Apr 22 19:58:33.978843 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.978824 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.980939 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.980757 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 19:58:33.981334 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.981314 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2vqmz" Apr 22 19:58:33.981532 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.981465 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:33.982512 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.982493 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-kcjxn\"" Apr 22 19:58:33.983670 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.983410 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 19:58:33.983670 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.983573 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fxvrl\"" Apr 22 19:58:33.983670 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.983605 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 19:58:33.984020 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.984001 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 19:58:33.984159 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.984139 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 19:58:33.984226 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.984213 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 19:58:33.984315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.984293 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:58:33.984410 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.984343 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 19:58:33.984472 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.984293 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 19:58:33.984472 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.984428 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-j5lfq\"" Apr 22 19:58:33.984568 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.984556 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 19:58:33.985008 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.984986 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-var-lib-kubelet\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.985099 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985019 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-system-cni-dir\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.985099 ip-10-0-135-221 
kubenswrapper[2583]: I0422 19:58:33.985042 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-os-release\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.985099 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985068 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-var-lib-cni-bin\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.985099 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985092 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-var-lib-cni-multus\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.985342 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985115 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-multus-conf-dir\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.985342 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985150 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw7zl\" (UniqueName: \"kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl\") pod \"network-check-target-rgxkv\" (UID: \"428fb0f0-657f-42fe-874e-120700caf3c2\") " pod="openshift-network-diagnostics/network-check-target-rgxkv" Apr 22 19:58:33.985342 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985173 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-multus-cni-dir\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.985342 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985199 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhzcc\" (UniqueName: \"kubernetes.io/projected/ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f-kube-api-access-lhzcc\") pod \"iptables-alerter-2vqmz\" (UID: \"ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f\") " pod="openshift-network-operator/iptables-alerter-2vqmz" Apr 22 19:58:33.985342 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985222 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1be306f2-45b8-43c3-9302-dce9d9ac5650-cnibin\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:33.985342 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985247 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trzvs\" (UniqueName: 
\"kubernetes.io/projected/1be306f2-45b8-43c3-9302-dce9d9ac5650-kube-api-access-trzvs\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:33.985342 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985269 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-sysctl-d\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:33.985342 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985290 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-var-lib-kubelet\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:33.985342 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985312 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs57c\" (UniqueName: \"kubernetes.io/projected/cc26712b-0e5e-4820-916c-7d4a26dc15f2-kube-api-access-qs57c\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:33.985342 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985336 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1be306f2-45b8-43c3-9302-dce9d9ac5650-os-release\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:33.985743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985358 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-cni-binary-copy\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.985743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985398 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-hostroot\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.985743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985423 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-run-multus-certs\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.985743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985459 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1be306f2-45b8-43c3-9302-dce9d9ac5650-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: 
\"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:33.985743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985483 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-etc-selinux\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:33.985743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985519 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-modprobe-d\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:33.985743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985540 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-lib-modules\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:33.985743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985575 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-etc-kubernetes\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.985743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985627 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqmcf\" (UniqueName: \"kubernetes.io/projected/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-kube-api-access-dqmcf\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.985743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985660 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f-host-slash\") pod \"iptables-alerter-2vqmz\" (UID: \"ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f\") " pod="openshift-network-operator/iptables-alerter-2vqmz" Apr 22 19:58:33.985743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985696 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs\") pod \"network-metrics-daemon-q6lbk\" (UID: \"fbf58ad5-56ae-4535-a07f-980865760128\") " pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 19:58:33.985743 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985721 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:33.985743 ip-10-0-135-221 
kubenswrapper[2583]: I0422 19:58:33.985744 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-device-dir\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:33.986322 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985776 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f-iptables-alerter-script\") pod \"iptables-alerter-2vqmz\" (UID: \"ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f\") " pod="openshift-network-operator/iptables-alerter-2vqmz" Apr 22 19:58:33.986322 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985807 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1be306f2-45b8-43c3-9302-dce9d9ac5650-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:33.986322 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985835 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-socket-dir\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:33.986322 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985894 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-registration-dir\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:33.986322 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985917 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-tuned\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:33.986322 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985954 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1be306f2-45b8-43c3-9302-dce9d9ac5650-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:33.986322 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.985976 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-sysctl-conf\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:33.986322 ip-10-0-135-221 
kubenswrapper[2583]: I0422 19:58:33.986009 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-systemd\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:33.986322 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986031 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-sys\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:33.986322 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986054 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zsbfm" Apr 22 19:58:33.986322 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986055 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-host\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:33.986322 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986267 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-multus-socket-dir-parent\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.986322 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986295 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-run-netns\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.986322 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986317 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-multus-daemon-config\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.986913 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986350 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1be306f2-45b8-43c3-9302-dce9d9ac5650-system-cni-dir\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:33.986913 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986392 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx79w\" (UniqueName: \"kubernetes.io/projected/fbf58ad5-56ae-4535-a07f-980865760128-kube-api-access-nx79w\") pod \"network-metrics-daemon-q6lbk\" (UID: \"fbf58ad5-56ae-4535-a07f-980865760128\") " pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 19:58:33.986913 ip-10-0-135-221 
kubenswrapper[2583]: I0422 19:58:33.986421 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-sys-fs\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:33.986913 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986447 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ghnw\" (UniqueName: \"kubernetes.io/projected/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-kube-api-access-4ghnw\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:33.986913 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986472 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-cnibin\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.986913 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986495 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-run-k8s-cni-cncf-io\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:33.986913 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986525 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1be306f2-45b8-43c3-9302-dce9d9ac5650-cni-binary-copy\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:33.986913 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986550 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-sysconfig\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:33.986913 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986582 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-kubernetes\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:33.986913 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986603 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-run\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:33.986913 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.986625 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/cc26712b-0e5e-4820-916c-7d4a26dc15f2-tmp\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:33.987910 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.987888 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 19:58:33.988004 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.987940 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 19:58:33.988059 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.988049 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wfwz5\"" Apr 22 19:58:33.988670 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.988633 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w5slr" Apr 22 19:58:33.988670 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.988655 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gvfnk" Apr 22 19:58:33.990579 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.990547 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:58:33.990660 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.990617 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:58:33.990660 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.990641 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 19:58:33.990660 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.990656 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 19:58:33.990773 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.990623 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6c4lr\"" Apr 22 19:58:33.990773 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.990617 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cktzz\"" Apr 22 19:58:33.990934 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:33.990912 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 19:58:34.014200 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.014154 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:53:33 +0000 UTC" deadline="2028-01-16 11:10:45.61501192 +0000 UTC" Apr 22 19:58:34.014200 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.014198 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15207h12m11.600817142s" Apr 22 19:58:34.074833 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.074798 2583 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:58:34.087561 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087531 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-device-dir\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:34.087733 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087569 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1be306f2-45b8-43c3-9302-dce9d9ac5650-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.087733 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087594 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-socket-dir\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:34.087733 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087623 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-slash\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.087733 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087646 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ab1e4670-372d-4a67-810d-77a48d25a47d-konnectivity-ca\") pod \"konnectivity-agent-zsbfm\" (UID: \"ab1e4670-372d-4a67-810d-77a48d25a47d\") " pod="kube-system/konnectivity-agent-zsbfm" Apr 22 19:58:34.087733 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087669 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-sys\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.087733 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087679 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-device-dir\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:34.087733 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087695 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-host\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.087733 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087718 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-etc-openvswitch\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087741 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-run-ovn\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087776 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-run-netns\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087800 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-multus-daemon-config\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087856 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1be306f2-45b8-43c3-9302-dce9d9ac5650-system-cni-dir\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087899 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nx79w\" (UniqueName: \"kubernetes.io/projected/fbf58ad5-56ae-4535-a07f-980865760128-kube-api-access-nx79w\") pod \"network-metrics-daemon-q6lbk\" (UID: \"fbf58ad5-56ae-4535-a07f-980865760128\") " pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087908 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1be306f2-45b8-43c3-9302-dce9d9ac5650-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087925 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-sys-fs\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087964 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ghnw\" (UniqueName: \"kubernetes.io/projected/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-kube-api-access-4ghnw\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.087993 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088015 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-socket-dir\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088024 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-sysconfig\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088051 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-run\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088062 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-run-netns\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088076 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-var-lib-openvswitch\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088102 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/363c0120-0870-4e33-8e8c-f6eeb68a30f9-ovnkube-config\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088113 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-sys\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.088153 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088127 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-var-lib-kubelet\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: 
I0422 19:58:34.088147 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-host\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088156 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-var-lib-cni-bin\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088183 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-var-lib-cni-multus\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088218 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-kubelet\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088237 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/363c0120-0870-4e33-8e8c-f6eeb68a30f9-ovnkube-script-lib\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088255 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntbqj\" (UniqueName: \"kubernetes.io/projected/363c0120-0870-4e33-8e8c-f6eeb68a30f9-kube-api-access-ntbqj\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088277 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-multus-cni-dir\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088294 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhzcc\" (UniqueName: \"kubernetes.io/projected/ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f-kube-api-access-lhzcc\") pod \"iptables-alerter-2vqmz\" (UID: \"ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f\") " pod="openshift-network-operator/iptables-alerter-2vqmz" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088310 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qs57c\" (UniqueName: \"kubernetes.io/projected/cc26712b-0e5e-4820-916c-7d4a26dc15f2-kube-api-access-qs57c\") pod \"tuned-lbzbf\" (UID: 
\"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088326 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-hostroot\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088342 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-run-multus-certs\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088393 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1be306f2-45b8-43c3-9302-dce9d9ac5650-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088408 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqmcf\" (UniqueName: \"kubernetes.io/projected/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-kube-api-access-dqmcf\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088425 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/363c0120-0870-4e33-8e8c-f6eeb68a30f9-env-overrides\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088442 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f-iptables-alerter-script\") pod \"iptables-alerter-2vqmz\" (UID: \"ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f\") " pod="openshift-network-operator/iptables-alerter-2vqmz" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088457 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-registration-dir\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:34.088924 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088471 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-tuned\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088487 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-cni-bin\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088504 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsrtf\" (UniqueName: \"kubernetes.io/projected/0cb9503d-e2e9-4f70-97aa-e8fa372598fc-kube-api-access-fsrtf\") pod \"node-resolver-w5slr\" (UID: \"0cb9503d-e2e9-4f70-97aa-e8fa372598fc\") " pod="openshift-dns/node-resolver-w5slr" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088539 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1be306f2-45b8-43c3-9302-dce9d9ac5650-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088562 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-sysctl-conf\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088762 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-multus-daemon-config\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088840 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-var-lib-cni-bin\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088908 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1be306f2-45b8-43c3-9302-dce9d9ac5650-system-cni-dir\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088965 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-var-lib-cni-multus\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.088998 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-run-multus-certs\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.089644 
ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089059 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1be306f2-45b8-43c3-9302-dce9d9ac5650-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089065 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-sysctl-conf\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089102 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-registration-dir\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089109 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-systemd\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089129 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-sys-fs\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089139 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-log-socket\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089186 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-systemd\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.089644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089190 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-run-ovn-kubernetes\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089235 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-multus-cni-dir\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089288 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-cni-netd\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089320 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/363c0120-0870-4e33-8e8c-f6eeb68a30f9-ovn-node-metrics-cert\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089309 2583 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089340 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-run\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089350 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-hostroot\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089354 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg6sp\" (UniqueName: \"kubernetes.io/projected/01431c0e-d992-47b0-b2db-613b46bfb3ba-kube-api-access-gg6sp\") pod \"node-ca-gvfnk\" (UID: \"01431c0e-d992-47b0-b2db-613b46bfb3ba\") " pod="openshift-image-registry/node-ca-gvfnk" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089396 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-kubernetes\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089420 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-var-lib-kubelet\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089429 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-multus-socket-dir-parent\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089476 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f-iptables-alerter-script\") pod \"iptables-alerter-2vqmz\" (UID: \"ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f\") " pod="openshift-network-operator/iptables-alerter-2vqmz" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089488 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-multus-socket-dir-parent\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089517 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-sysconfig\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089587 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-kubernetes\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089602 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-run-systemd\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089640 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0cb9503d-e2e9-4f70-97aa-e8fa372598fc-hosts-file\") pod \"node-resolver-w5slr\" (UID: \"0cb9503d-e2e9-4f70-97aa-e8fa372598fc\") " pod="openshift-dns/node-resolver-w5slr" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089675 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0cb9503d-e2e9-4f70-97aa-e8fa372598fc-tmp-dir\") pod \"node-resolver-w5slr\" (UID: \"0cb9503d-e2e9-4f70-97aa-e8fa372598fc\") " pod="openshift-dns/node-resolver-w5slr" Apr 22 19:58:34.090323 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089710 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-cnibin\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089747 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-run-k8s-cni-cncf-io\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089758 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-cnibin\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089781 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1be306f2-45b8-43c3-9302-dce9d9ac5650-cni-binary-copy\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089810 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc26712b-0e5e-4820-916c-7d4a26dc15f2-tmp\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089812 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-host-run-k8s-cni-cncf-io\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089891 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01431c0e-d992-47b0-b2db-613b46bfb3ba-host\") pod \"node-ca-gvfnk\" (UID: \"01431c0e-d992-47b0-b2db-613b46bfb3ba\") " pod="openshift-image-registry/node-ca-gvfnk" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089930 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-system-cni-dir\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089954 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-os-release\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.089985 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1be306f2-45b8-43c3-9302-dce9d9ac5650-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090034 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-multus-conf-dir\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090067 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-os-release\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090070 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rw7zl\" (UniqueName: \"kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl\") pod \"network-check-target-rgxkv\" (UID: \"428fb0f0-657f-42fe-874e-120700caf3c2\") " pod="openshift-network-diagnostics/network-check-target-rgxkv" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090107 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1be306f2-45b8-43c3-9302-dce9d9ac5650-cnibin\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090135 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trzvs\" (UniqueName: \"kubernetes.io/projected/1be306f2-45b8-43c3-9302-dce9d9ac5650-kube-api-access-trzvs\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090177 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-sysctl-d\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090223 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-var-lib-kubelet\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.091315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090251 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-systemd-units\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090276 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-run-netns\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.092069 
ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090296 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-system-cni-dir\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090310 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1be306f2-45b8-43c3-9302-dce9d9ac5650-cni-binary-copy\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090303 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-run-openvswitch\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090395 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1be306f2-45b8-43c3-9302-dce9d9ac5650-cnibin\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090535 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-sysctl-d\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090650 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-multus-conf-dir\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090655 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1be306f2-45b8-43c3-9302-dce9d9ac5650-os-release\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090672 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-var-lib-kubelet\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090745 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1be306f2-45b8-43c3-9302-dce9d9ac5650-os-release\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " 
pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090765 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-cni-binary-copy\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090816 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-etc-selinux\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090904 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-modprobe-d\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090929 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-lib-modules\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.090979 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-node-log\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.091005 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ab1e4670-372d-4a67-810d-77a48d25a47d-agent-certs\") pod \"konnectivity-agent-zsbfm\" (UID: \"ab1e4670-372d-4a67-810d-77a48d25a47d\") " pod="kube-system/konnectivity-agent-zsbfm" Apr 22 19:58:34.092069 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.091028 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/01431c0e-d992-47b0-b2db-613b46bfb3ba-serviceca\") pod \"node-ca-gvfnk\" (UID: \"01431c0e-d992-47b0-b2db-613b46bfb3ba\") " pod="openshift-image-registry/node-ca-gvfnk" Apr 22 19:58:34.092821 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.091132 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-lib-modules\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.092821 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.091211 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-etc-selinux\") pod 
\"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:34.092821 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.091286 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-modprobe-d\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.092821 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.091325 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-cni-binary-copy\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.092821 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.091349 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-etc-kubernetes\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.092821 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.091399 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-etc-kubernetes\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.092821 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.091432 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f-host-slash\") pod \"iptables-alerter-2vqmz\" (UID: \"ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f\") " pod="openshift-network-operator/iptables-alerter-2vqmz" Apr 22 19:58:34.092821 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.091474 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs\") pod \"network-metrics-daemon-q6lbk\" (UID: \"fbf58ad5-56ae-4535-a07f-980865760128\") " pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 19:58:34.092821 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.091526 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f-host-slash\") pod \"iptables-alerter-2vqmz\" (UID: \"ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f\") " pod="openshift-network-operator/iptables-alerter-2vqmz" Apr 22 19:58:34.092821 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.091559 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:34.092821 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:34.091678 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 
19:58:34.092821 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.091727 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:34.092821 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:34.091805 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs podName:fbf58ad5-56ae-4535-a07f-980865760128 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:34.591785879 +0000 UTC m=+3.011692651 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs") pod "network-metrics-daemon-q6lbk" (UID: "fbf58ad5-56ae-4535-a07f-980865760128") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:34.093458 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.093125 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cc26712b-0e5e-4820-916c-7d4a26dc15f2-etc-tuned\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.095882 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.095820 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-221.ec2.internal" event={"ID":"3c3427913a40e2eb667593f4c197b1b3","Type":"ContainerStarted","Data":"858cac969abe130332b7f775405cf13fdd8d62bc5a3b4e6ccb2228acee84a58c"} Apr 22 19:58:34.096954 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.096929 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal" event={"ID":"a35607486dd92a2cad3be4c453bea29b","Type":"ContainerStarted","Data":"82551d6a632ea4245a7e07aa4e122b9d9c3770732008a49264d581fef57763f5"} Apr 22 19:58:34.097243 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.097223 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc26712b-0e5e-4820-916c-7d4a26dc15f2-tmp\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.109190 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.109107 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs57c\" (UniqueName: \"kubernetes.io/projected/cc26712b-0e5e-4820-916c-7d4a26dc15f2-kube-api-access-qs57c\") pod \"tuned-lbzbf\" (UID: \"cc26712b-0e5e-4820-916c-7d4a26dc15f2\") " pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" Apr 22 19:58:34.115650 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.115625 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhzcc\" (UniqueName: \"kubernetes.io/projected/ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f-kube-api-access-lhzcc\") pod \"iptables-alerter-2vqmz\" (UID: \"ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f\") " pod="openshift-network-operator/iptables-alerter-2vqmz" Apr 22 19:58:34.119779 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:34.119757 2583 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:34.119779 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:34.119781 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:34.119977 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:34.119793 2583 projected.go:194] Error preparing data for projected volume kube-api-access-rw7zl for pod openshift-network-diagnostics/network-check-target-rgxkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:34.119977 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:34.119844 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl podName:428fb0f0-657f-42fe-874e-120700caf3c2 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:34.619829999 +0000 UTC m=+3.039736772 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rw7zl" (UniqueName: "kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl") pod "network-check-target-rgxkv" (UID: "428fb0f0-657f-42fe-874e-120700caf3c2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:34.120251 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.120228 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trzvs\" (UniqueName: \"kubernetes.io/projected/1be306f2-45b8-43c3-9302-dce9d9ac5650-kube-api-access-trzvs\") pod \"multus-additional-cni-plugins-x9bbm\" (UID: \"1be306f2-45b8-43c3-9302-dce9d9ac5650\") " pod="openshift-multus/multus-additional-cni-plugins-x9bbm" Apr 22 19:58:34.120996 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.120975 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx79w\" (UniqueName: \"kubernetes.io/projected/fbf58ad5-56ae-4535-a07f-980865760128-kube-api-access-nx79w\") pod \"network-metrics-daemon-q6lbk\" (UID: \"fbf58ad5-56ae-4535-a07f-980865760128\") " pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 19:58:34.122163 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.122145 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqmcf\" (UniqueName: \"kubernetes.io/projected/d9a2d7b0-a64a-4b9d-9b12-11745c8dde45-kube-api-access-dqmcf\") pod \"multus-wvr6d\" (UID: \"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45\") " pod="openshift-multus/multus-wvr6d" Apr 22 19:58:34.122527 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.122510 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ghnw\" (UniqueName: \"kubernetes.io/projected/80f4b313-2b5e-45df-b4a7-4e3e8651d1ad-kube-api-access-4ghnw\") pod \"aws-ebs-csi-driver-node-pv2vh\" (UID: \"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" Apr 22 19:58:34.192133 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192046 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-etc-openvswitch\") pod 
\"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.192133 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192090 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-run-ovn\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.192328 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192169 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-etc-openvswitch\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.192328 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192230 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.192328 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192243 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-run-ovn\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.192328 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192267 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-var-lib-openvswitch\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.192328 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192284 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/363c0120-0870-4e33-8e8c-f6eeb68a30f9-ovnkube-config\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.192328 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192297 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.192328 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192302 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-kubelet\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.192328 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192327 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-var-lib-openvswitch\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.192644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192343 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-kubelet\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.192644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192348 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/363c0120-0870-4e33-8e8c-f6eeb68a30f9-ovnkube-script-lib\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.192644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192384 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntbqj\" (UniqueName: \"kubernetes.io/projected/363c0120-0870-4e33-8e8c-f6eeb68a30f9-kube-api-access-ntbqj\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.192644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192415 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/363c0120-0870-4e33-8e8c-f6eeb68a30f9-env-overrides\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.192644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192574 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-cni-bin\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.192644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192607 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsrtf\" (UniqueName: \"kubernetes.io/projected/0cb9503d-e2e9-4f70-97aa-e8fa372598fc-kube-api-access-fsrtf\") pod \"node-resolver-w5slr\" (UID: \"0cb9503d-e2e9-4f70-97aa-e8fa372598fc\") " pod="openshift-dns/node-resolver-w5slr" Apr 22 19:58:34.192644 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192636 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-log-socket\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.193004 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192662 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-run-ovn-kubernetes\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.193004 ip-10-0-135-221 
kubenswrapper[2583]: I0422 19:58:34.192680 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-cni-bin\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.193004 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192687 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-cni-netd\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.193004 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192712 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/363c0120-0870-4e33-8e8c-f6eeb68a30f9-ovn-node-metrics-cert\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.193004 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192750 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-run-ovn-kubernetes\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.193004 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192833 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/363c0120-0870-4e33-8e8c-f6eeb68a30f9-env-overrides\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.193004 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192911 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-cni-netd\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.193004 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192921 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/363c0120-0870-4e33-8e8c-f6eeb68a30f9-ovnkube-config\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.193004 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192954 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-log-socket\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:34.193004 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.192985 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gg6sp\" (UniqueName: \"kubernetes.io/projected/01431c0e-d992-47b0-b2db-613b46bfb3ba-kube-api-access-gg6sp\") pod \"node-ca-gvfnk\" (UID: \"01431c0e-d992-47b0-b2db-613b46bfb3ba\") " pod="openshift-image-registry/node-ca-gvfnk" 
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193022 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-run-systemd\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193055 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-run-systemd\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193065 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0cb9503d-e2e9-4f70-97aa-e8fa372598fc-hosts-file\") pod \"node-resolver-w5slr\" (UID: \"0cb9503d-e2e9-4f70-97aa-e8fa372598fc\") " pod="openshift-dns/node-resolver-w5slr"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193085 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/363c0120-0870-4e33-8e8c-f6eeb68a30f9-ovnkube-script-lib\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193099 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0cb9503d-e2e9-4f70-97aa-e8fa372598fc-tmp-dir\") pod \"node-resolver-w5slr\" (UID: \"0cb9503d-e2e9-4f70-97aa-e8fa372598fc\") " pod="openshift-dns/node-resolver-w5slr"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193128 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0cb9503d-e2e9-4f70-97aa-e8fa372598fc-hosts-file\") pod \"node-resolver-w5slr\" (UID: \"0cb9503d-e2e9-4f70-97aa-e8fa372598fc\") " pod="openshift-dns/node-resolver-w5slr"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193146 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01431c0e-d992-47b0-b2db-613b46bfb3ba-host\") pod \"node-ca-gvfnk\" (UID: \"01431c0e-d992-47b0-b2db-613b46bfb3ba\") " pod="openshift-image-registry/node-ca-gvfnk"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193204 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-systemd-units\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193196 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01431c0e-d992-47b0-b2db-613b46bfb3ba-host\") pod \"node-ca-gvfnk\" (UID: \"01431c0e-d992-47b0-b2db-613b46bfb3ba\") " pod="openshift-image-registry/node-ca-gvfnk"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193231 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-run-netns\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193255 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-run-openvswitch\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193257 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-systemd-units\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193284 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-node-log\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193296 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-run-netns\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193308 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ab1e4670-372d-4a67-810d-77a48d25a47d-agent-certs\") pod \"konnectivity-agent-zsbfm\" (UID: \"ab1e4670-372d-4a67-810d-77a48d25a47d\") " pod="kube-system/konnectivity-agent-zsbfm"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193331 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-run-openvswitch\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193332 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/01431c0e-d992-47b0-b2db-613b46bfb3ba-serviceca\") pod \"node-ca-gvfnk\" (UID: \"01431c0e-d992-47b0-b2db-613b46bfb3ba\") " pod="openshift-image-registry/node-ca-gvfnk"
Apr 22 19:58:34.193455 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193368 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-node-log\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.194204 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193393 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-slash\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.194204 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193403 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0cb9503d-e2e9-4f70-97aa-e8fa372598fc-tmp-dir\") pod \"node-resolver-w5slr\" (UID: \"0cb9503d-e2e9-4f70-97aa-e8fa372598fc\") " pod="openshift-dns/node-resolver-w5slr"
Apr 22 19:58:34.194204 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193421 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ab1e4670-372d-4a67-810d-77a48d25a47d-konnectivity-ca\") pod \"konnectivity-agent-zsbfm\" (UID: \"ab1e4670-372d-4a67-810d-77a48d25a47d\") " pod="kube-system/konnectivity-agent-zsbfm"
Apr 22 19:58:34.194204 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193466 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/363c0120-0870-4e33-8e8c-f6eeb68a30f9-host-slash\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.194204 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193728 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/01431c0e-d992-47b0-b2db-613b46bfb3ba-serviceca\") pod \"node-ca-gvfnk\" (UID: \"01431c0e-d992-47b0-b2db-613b46bfb3ba\") " pod="openshift-image-registry/node-ca-gvfnk"
Apr 22 19:58:34.194204 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.193971 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ab1e4670-372d-4a67-810d-77a48d25a47d-konnectivity-ca\") pod \"konnectivity-agent-zsbfm\" (UID: \"ab1e4670-372d-4a67-810d-77a48d25a47d\") " pod="kube-system/konnectivity-agent-zsbfm"
Apr 22 19:58:34.195313 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.195293 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/363c0120-0870-4e33-8e8c-f6eeb68a30f9-ovn-node-metrics-cert\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.195986 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.195967 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ab1e4670-372d-4a67-810d-77a48d25a47d-agent-certs\") pod \"konnectivity-agent-zsbfm\" (UID: \"ab1e4670-372d-4a67-810d-77a48d25a47d\") " pod="kube-system/konnectivity-agent-zsbfm"
Apr 22 19:58:34.200721 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.200699 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntbqj\" (UniqueName: \"kubernetes.io/projected/363c0120-0870-4e33-8e8c-f6eeb68a30f9-kube-api-access-ntbqj\") pod \"ovnkube-node-ws7ww\" (UID: \"363c0120-0870-4e33-8e8c-f6eeb68a30f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.201130 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.201107 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg6sp\" (UniqueName: \"kubernetes.io/projected/01431c0e-d992-47b0-b2db-613b46bfb3ba-kube-api-access-gg6sp\") pod \"node-ca-gvfnk\" (UID: \"01431c0e-d992-47b0-b2db-613b46bfb3ba\") " pod="openshift-image-registry/node-ca-gvfnk"
Apr 22 19:58:34.201289 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.201261 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsrtf\" (UniqueName: \"kubernetes.io/projected/0cb9503d-e2e9-4f70-97aa-e8fa372598fc-kube-api-access-fsrtf\") pod \"node-resolver-w5slr\" (UID: \"0cb9503d-e2e9-4f70-97aa-e8fa372598fc\") " pod="openshift-dns/node-resolver-w5slr"
Apr 22 19:58:34.273257 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.273215 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x9bbm"
Apr 22 19:58:34.286264 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.286227 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh"
Apr 22 19:58:34.293958 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.293937 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lbzbf"
Apr 22 19:58:34.299662 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.299637 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wvr6d"
Apr 22 19:58:34.307240 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.307222 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2vqmz"
Apr 22 19:58:34.323802 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.323780 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww"
Apr 22 19:58:34.330503 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.330473 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zsbfm"
Apr 22 19:58:34.338011 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.337994 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w5slr"
Apr 22 19:58:34.343550 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.343525 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gvfnk"
Apr 22 19:58:34.361375 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.361352 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:58:34.595707 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.595618 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs\") pod \"network-metrics-daemon-q6lbk\" (UID: \"fbf58ad5-56ae-4535-a07f-980865760128\") " pod="openshift-multus/network-metrics-daemon-q6lbk"
Apr 22 19:58:34.595987 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:34.595734 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:34.595987 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:34.595803 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs podName:fbf58ad5-56ae-4535-a07f-980865760128 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:35.595786596 +0000 UTC m=+4.015693369 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs") pod "network-metrics-daemon-q6lbk" (UID: "fbf58ad5-56ae-4535-a07f-980865760128") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:34.677087 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.677059 2583 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:58:34.696428 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:34.696393 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rw7zl\" (UniqueName: \"kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl\") pod \"network-check-target-rgxkv\" (UID: \"428fb0f0-657f-42fe-874e-120700caf3c2\") " pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:58:34.696609 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:34.696562 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:58:34.696609 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:34.696583 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:58:34.696609 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:34.696593 2583 projected.go:194] Error preparing data for projected volume kube-api-access-rw7zl for pod openshift-network-diagnostics/network-check-target-rgxkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:34.696735 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:34.696644 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl podName:428fb0f0-657f-42fe-874e-120700caf3c2 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:35.696628304 +0000 UTC m=+4.116535078 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-rw7zl" (UniqueName: "kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl") pod "network-check-target-rgxkv" (UID: "428fb0f0-657f-42fe-874e-120700caf3c2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:34.986329 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:34.986296 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9a2d7b0_a64a_4b9d_9b12_11745c8dde45.slice/crio-a4818f8896c58ea2f36fb05b460c54dcd40ed60ba22553aa84a9d684e80ce39e WatchSource:0}: Error finding container a4818f8896c58ea2f36fb05b460c54dcd40ed60ba22553aa84a9d684e80ce39e: Status 404 returned error can't find the container with id a4818f8896c58ea2f36fb05b460c54dcd40ed60ba22553aa84a9d684e80ce39e
Apr 22 19:58:34.987712 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:34.987683 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01431c0e_d992_47b0_b2db_613b46bfb3ba.slice/crio-b9bd00f0bd20f69c6ccd704ec47a870619bf74c3cab0eebc61b1a6c5cde65343 WatchSource:0}: Error finding container b9bd00f0bd20f69c6ccd704ec47a870619bf74c3cab0eebc61b1a6c5cde65343: Status 404 returned error can't find the container with id b9bd00f0bd20f69c6ccd704ec47a870619bf74c3cab0eebc61b1a6c5cde65343
Apr 22 19:58:34.991227 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:34.991184 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cb9503d_e2e9_4f70_97aa_e8fa372598fc.slice/crio-8b452bbe7b24e8ccde3d8002829ea0415d7476ed1416c495a23ef20bc1b013db WatchSource:0}: Error finding container 8b452bbe7b24e8ccde3d8002829ea0415d7476ed1416c495a23ef20bc1b013db: Status 404 returned error can't find the container with id 8b452bbe7b24e8ccde3d8002829ea0415d7476ed1416c495a23ef20bc1b013db
Apr 22 19:58:34.992152 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:34.992128 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1be306f2_45b8_43c3_9302_dce9d9ac5650.slice/crio-55ca50d2e7e0c58c9c2846902fc3a518f844f64a86c91b3aa5651e518f251cff WatchSource:0}: Error finding container 55ca50d2e7e0c58c9c2846902fc3a518f844f64a86c91b3aa5651e518f251cff: Status 404 returned error can't find the container with id 55ca50d2e7e0c58c9c2846902fc3a518f844f64a86c91b3aa5651e518f251cff
Apr 22 19:58:35.000126 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:58:34.999958 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80f4b313_2b5e_45df_b4a7_4e3e8651d1ad.slice/crio-3030182b590d7dfd6c87593947721e2defd99f36f2cb009656a8c57f3c9ddadb WatchSource:0}: Error finding container 3030182b590d7dfd6c87593947721e2defd99f36f2cb009656a8c57f3c9ddadb: Status 404 returned error can't find the container with id 3030182b590d7dfd6c87593947721e2defd99f36f2cb009656a8c57f3c9ddadb
Apr 22 19:58:35.014634 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:35.014607 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:53:33 +0000 UTC" deadline="2027-12-23 03:52:01.720310966 +0000 UTC"
Apr 22 19:58:35.014634 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:35.014632 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14623h53m26.705680955s"
Apr 22 19:58:35.100175 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:35.099990 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" event={"ID":"cc26712b-0e5e-4820-916c-7d4a26dc15f2","Type":"ContainerStarted","Data":"318190c1eafe5180edd9211d088e5aaac80622e5c00694baa25f9939e29a0583"}
Apr 22 19:58:35.100971 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:35.100949 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w5slr" event={"ID":"0cb9503d-e2e9-4f70-97aa-e8fa372598fc","Type":"ContainerStarted","Data":"8b452bbe7b24e8ccde3d8002829ea0415d7476ed1416c495a23ef20bc1b013db"}
Apr 22 19:58:35.101837 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:35.101812 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gvfnk" event={"ID":"01431c0e-d992-47b0-b2db-613b46bfb3ba","Type":"ContainerStarted","Data":"b9bd00f0bd20f69c6ccd704ec47a870619bf74c3cab0eebc61b1a6c5cde65343"}
Apr 22 19:58:35.102705 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:35.102672 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvr6d" event={"ID":"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45","Type":"ContainerStarted","Data":"a4818f8896c58ea2f36fb05b460c54dcd40ed60ba22553aa84a9d684e80ce39e"}
Apr 22 19:58:35.104199 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:35.104176 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-221.ec2.internal" event={"ID":"3c3427913a40e2eb667593f4c197b1b3","Type":"ContainerStarted","Data":"f4fdf0074619fd73309a3e1e2898abd12301bffbbdeb2c1acb3702e1a0401d1c"}
Apr 22 19:58:35.105162 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:35.105132 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" event={"ID":"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad","Type":"ContainerStarted","Data":"3030182b590d7dfd6c87593947721e2defd99f36f2cb009656a8c57f3c9ddadb"}
Apr 22 19:58:35.106031 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:35.106012 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" event={"ID":"363c0120-0870-4e33-8e8c-f6eeb68a30f9","Type":"ContainerStarted","Data":"b173ec703236f81cb79f662ff014d6ce590c29e4d0243e28f0081e589030decb"}
Apr 22 19:58:35.106936 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:35.106916 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2vqmz" event={"ID":"ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f","Type":"ContainerStarted","Data":"a473ed47579fefc6518e9bd0af877e64e7a95db850462c82a683c6650098bf6b"}
Apr 22 19:58:35.107845 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:35.107828 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9bbm" event={"ID":"1be306f2-45b8-43c3-9302-dce9d9ac5650","Type":"ContainerStarted","Data":"55ca50d2e7e0c58c9c2846902fc3a518f844f64a86c91b3aa5651e518f251cff"}
Apr 22 19:58:35.108713 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:35.108687 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zsbfm" event={"ID":"ab1e4670-372d-4a67-810d-77a48d25a47d","Type":"ContainerStarted","Data":"e72042d96c10f4dbf950da81ae8121ee1384e47504ba948f564139d828b880df"}
Apr 22 19:58:35.612881 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:35.607508 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs\") pod \"network-metrics-daemon-q6lbk\" (UID: \"fbf58ad5-56ae-4535-a07f-980865760128\") " pod="openshift-multus/network-metrics-daemon-q6lbk"
Apr 22 19:58:35.617880 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:35.613234 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:35.617880 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:35.613323 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs podName:fbf58ad5-56ae-4535-a07f-980865760128 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:37.613302114 +0000 UTC m=+6.033208901 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs") pod "network-metrics-daemon-q6lbk" (UID: "fbf58ad5-56ae-4535-a07f-980865760128") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:35.712899 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:35.712833 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rw7zl\" (UniqueName: \"kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl\") pod \"network-check-target-rgxkv\" (UID: \"428fb0f0-657f-42fe-874e-120700caf3c2\") " pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:58:35.713059 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:35.713044 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:58:35.713120 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:35.713063 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:58:35.713120 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:35.713076 2583 projected.go:194] Error preparing data for projected volume kube-api-access-rw7zl for pod openshift-network-diagnostics/network-check-target-rgxkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:35.713263 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:35.713137 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl podName:428fb0f0-657f-42fe-874e-120700caf3c2 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:37.713117141 +0000 UTC m=+6.133023931 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rw7zl" (UniqueName: "kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl") pod "network-check-target-rgxkv" (UID: "428fb0f0-657f-42fe-874e-120700caf3c2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:36.098103 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:36.097843 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:58:36.098103 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:36.097984 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2"
Apr 22 19:58:36.098582 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:36.098414 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk"
Apr 22 19:58:36.098582 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:36.098513 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128"
Apr 22 19:58:36.124947 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:36.124010 2583 generic.go:358] "Generic (PLEG): container finished" podID="a35607486dd92a2cad3be4c453bea29b" containerID="63881bbaa5011f9f6f49fb4845ca57c994d5c33c62a1f2ceb810589883ff5ef9" exitCode=0
Apr 22 19:58:36.124947 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:36.124937 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal" event={"ID":"a35607486dd92a2cad3be4c453bea29b","Type":"ContainerDied","Data":"63881bbaa5011f9f6f49fb4845ca57c994d5c33c62a1f2ceb810589883ff5ef9"}
Apr 22 19:58:36.145036 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:36.144985 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-221.ec2.internal" podStartSLOduration=3.14496743 podStartE2EDuration="3.14496743s" podCreationTimestamp="2026-04-22 19:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:35.118525574 +0000 UTC m=+3.538432365" watchObservedRunningTime="2026-04-22 19:58:36.14496743 +0000 UTC m=+4.564874221"
Apr 22 19:58:37.139886 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:37.139559 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal" event={"ID":"a35607486dd92a2cad3be4c453bea29b","Type":"ContainerStarted","Data":"5df358cc5b1309620970c5ee383578f5abfbc40d1edf051cca74964536a69aed"}
Apr 22 19:58:37.630704 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:37.630106 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs\") pod \"network-metrics-daemon-q6lbk\" (UID: \"fbf58ad5-56ae-4535-a07f-980865760128\") " pod="openshift-multus/network-metrics-daemon-q6lbk"
Apr 22 19:58:37.630704 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:37.630263 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:37.630704 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:37.630328 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs podName:fbf58ad5-56ae-4535-a07f-980865760128 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:41.630308427 +0000 UTC m=+10.050215199 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs") pod "network-metrics-daemon-q6lbk" (UID: "fbf58ad5-56ae-4535-a07f-980865760128") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:37.731098 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:37.731002 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rw7zl\" (UniqueName: \"kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl\") pod \"network-check-target-rgxkv\" (UID: \"428fb0f0-657f-42fe-874e-120700caf3c2\") " pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:58:37.731248 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:37.731196 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:58:37.731248 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:37.731224 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:58:37.731248 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:37.731238 2583 projected.go:194] Error preparing data for projected volume kube-api-access-rw7zl for pod openshift-network-diagnostics/network-check-target-rgxkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:37.731352 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:37.731297 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl podName:428fb0f0-657f-42fe-874e-120700caf3c2 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:41.731277765 +0000 UTC m=+10.151184558 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-rw7zl" (UniqueName: "kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl") pod "network-check-target-rgxkv" (UID: "428fb0f0-657f-42fe-874e-120700caf3c2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:38.096085 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:38.096009 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:58:38.096284 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:38.096133 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2"
Apr 22 19:58:38.096544 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:38.096524 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk"
Apr 22 19:58:38.096649 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:38.096628 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128"
Apr 22 19:58:40.095975 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:40.095947 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:58:40.095975 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:40.095968 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk"
Apr 22 19:58:40.096440 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:40.096078 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2"
Apr 22 19:58:40.096440 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:40.096158 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128"
Apr 22 19:58:41.663107 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:41.663068 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs\") pod \"network-metrics-daemon-q6lbk\" (UID: \"fbf58ad5-56ae-4535-a07f-980865760128\") " pod="openshift-multus/network-metrics-daemon-q6lbk"
Apr 22 19:58:41.663523 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:41.663191 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:41.663523 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:41.663258 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs podName:fbf58ad5-56ae-4535-a07f-980865760128 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:49.663240421 +0000 UTC m=+18.083147194 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs") pod "network-metrics-daemon-q6lbk" (UID: "fbf58ad5-56ae-4535-a07f-980865760128") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:41.764448 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:41.764402 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rw7zl\" (UniqueName: \"kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl\") pod \"network-check-target-rgxkv\" (UID: \"428fb0f0-657f-42fe-874e-120700caf3c2\") " pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:58:41.764621 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:41.764590 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:58:41.764621 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:41.764613 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:58:41.764621 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:41.764622 2583 projected.go:194] Error preparing data for projected volume kube-api-access-rw7zl for pod openshift-network-diagnostics/network-check-target-rgxkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:41.764767 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:41.764671 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl podName:428fb0f0-657f-42fe-874e-120700caf3c2 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:49.76465272 +0000 UTC m=+18.184559693 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-rw7zl" (UniqueName: "kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl") pod "network-check-target-rgxkv" (UID: "428fb0f0-657f-42fe-874e-120700caf3c2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:42.093626 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:42.092056 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk"
Apr 22 19:58:42.093626 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:42.092194 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128"
Apr 22 19:58:42.093626 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:42.093445 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:58:42.093626 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:42.093535 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2"
Apr 22 19:58:44.094695 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:44.094661 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:58:44.095162 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:44.094671 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk"
Apr 22 19:58:44.095162 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:44.094793 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2"
Apr 22 19:58:44.095162 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:44.094875 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128"
Apr 22 19:58:46.095449 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:46.095407 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:58:46.095944 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:46.095521 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2"
Apr 22 19:58:46.095992 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:46.095980 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk"
Apr 22 19:58:46.096086 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:46.096062 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128"
Apr 22 19:58:48.092062 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:48.092029 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:58:48.092448 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:48.092073 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk"
Apr 22 19:58:48.092448 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:48.092164 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2"
Apr 22 19:58:48.092448 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:48.092299 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128"
Apr 22 19:58:49.725210 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:49.725170 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs\") pod \"network-metrics-daemon-q6lbk\" (UID: \"fbf58ad5-56ae-4535-a07f-980865760128\") " pod="openshift-multus/network-metrics-daemon-q6lbk"
Apr 22 19:58:49.725682 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:49.725354 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:49.725682 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:49.725434 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs podName:fbf58ad5-56ae-4535-a07f-980865760128 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:05.725412461 +0000 UTC m=+34.145319239 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs") pod "network-metrics-daemon-q6lbk" (UID: "fbf58ad5-56ae-4535-a07f-980865760128") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:49.826240 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:49.826202 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rw7zl\" (UniqueName: \"kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl\") pod \"network-check-target-rgxkv\" (UID: \"428fb0f0-657f-42fe-874e-120700caf3c2\") " pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:58:49.826434 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:49.826355 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:58:49.826434 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:49.826379 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:58:49.826434 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:49.826391 2583 projected.go:194] Error preparing data for projected volume kube-api-access-rw7zl for pod openshift-network-diagnostics/network-check-target-rgxkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:49.826585 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:49.826454 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl podName:428fb0f0-657f-42fe-874e-120700caf3c2 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:05.826435792 +0000 UTC m=+34.246342561 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-rw7zl" (UniqueName: "kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl") pod "network-check-target-rgxkv" (UID: "428fb0f0-657f-42fe-874e-120700caf3c2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:50.091708 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:50.091624 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk"
Apr 22 19:58:50.091899 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:50.091622 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:58:50.091899 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:50.091783 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128"
Apr 22 19:58:50.091899 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:50.091848 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2"
Apr 22 19:58:52.093052 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:52.093017 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:58:52.093496 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:52.093130 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2"
Apr 22 19:58:52.093496 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:52.093193 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk"
Apr 22 19:58:52.093496 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:52.093290 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128"
Apr 22 19:58:53.167913 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.167701 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" event={"ID":"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad","Type":"ContainerStarted","Data":"962ced05dbe880e400382eca9d87469686a2b99db7670cea8d46f34f064ff6d7"}
Apr 22 19:58:53.171744 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.171669 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" event={"ID":"363c0120-0870-4e33-8e8c-f6eeb68a30f9","Type":"ContainerStarted","Data":"4be9ab5363dc5597a48d0961401e8ab2516c845cf858b21f69e68acb77be2a4b"}
Apr 22 19:58:53.171744 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.171704 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" event={"ID":"363c0120-0870-4e33-8e8c-f6eeb68a30f9","Type":"ContainerStarted","Data":"e51a6f42c8b4088ffe2302aa8c832035b79db8f854f8ccc44eb295d0c7ef4ef1"}
Apr 22 19:58:53.171744 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.171713 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" event={"ID":"363c0120-0870-4e33-8e8c-f6eeb68a30f9","Type":"ContainerStarted","Data":"c9d73740468de8b034bf87b8d985c274e42476809e98ecb52af7ece33b7b3ed1"}
Apr 22 19:58:53.173568 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.173541 2583 generic.go:358] "Generic (PLEG): container finished" podID="1be306f2-45b8-43c3-9302-dce9d9ac5650" containerID="bcb4313b56bfa4ac1008bdd6e38e192372552f6803893a1b9696f098e19f9f56" exitCode=0
Apr 22 19:58:53.173665 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.173578 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9bbm" event={"ID":"1be306f2-45b8-43c3-9302-dce9d9ac5650","Type":"ContainerDied","Data":"bcb4313b56bfa4ac1008bdd6e38e192372552f6803893a1b9696f098e19f9f56"}
Apr 22 19:58:53.175262 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.175236 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zsbfm" event={"ID":"ab1e4670-372d-4a67-810d-77a48d25a47d","Type":"ContainerStarted","Data":"66ed36631b9022180632faf94c949231158aca9a8d9727750c873dbdb79214f6"}
Apr 22 19:58:53.178824 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.178742 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" event={"ID":"cc26712b-0e5e-4820-916c-7d4a26dc15f2","Type":"ContainerStarted","Data":"aaae4b011eecc6316c0ff7e0e7ec1e214bfdf26ed6eeba01736745ab448bc1a1"}
Apr 22 19:58:53.180106 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.180073 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w5slr" event={"ID":"0cb9503d-e2e9-4f70-97aa-e8fa372598fc","Type":"ContainerStarted","Data":"7b60c96fe620432d33e8829eb21df31e3c8b2764f650fad6d957d6b0ce5e6302"}
Apr 22 19:58:53.181315 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.181296 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gvfnk" event={"ID":"01431c0e-d992-47b0-b2db-613b46bfb3ba","Type":"ContainerStarted","Data":"50885f31e16f7526e89578c7314380cccd149ffa0ec27a3b2f7f7c884598bb17"}
Apr 22 19:58:53.182656 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.182639 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wvr6d" event={"ID":"d9a2d7b0-a64a-4b9d-9b12-11745c8dde45","Type":"ContainerStarted","Data":"ae56e58ed6fc20ba6414a49653b93087d41b6fa15ab96690afb1dfb020fe521e"}
Apr 22 19:58:53.194590 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.194470 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-221.ec2.internal" podStartSLOduration=20.194456919 podStartE2EDuration="20.194456919s" podCreationTimestamp="2026-04-22 19:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:37.155198555 +0000 UTC m=+5.575105349" watchObservedRunningTime="2026-04-22 19:58:53.194456919 +0000 UTC m=+21.614363710"
Apr 22 19:58:53.209921 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.209883 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lbzbf" podStartSLOduration=3.734037324 podStartE2EDuration="21.209848329s" podCreationTimestamp="2026-04-22 19:58:32 +0000 UTC" firstStartedPulling="2026-04-22 19:58:34.998791507 +0000 UTC m=+3.418698275" lastFinishedPulling="2026-04-22 19:58:52.474602496 +0000 UTC m=+20.894509280" observedRunningTime="2026-04-22 19:58:53.209816702 +0000 UTC m=+21.629723493" watchObservedRunningTime="2026-04-22 19:58:53.209848329 +0000 UTC m=+21.629755120"
Apr 22 19:58:53.224740 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.224698 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wvr6d" podStartSLOduration=3.717666987 podStartE2EDuration="21.224682345s" podCreationTimestamp="2026-04-22 19:58:32 +0000 UTC" firstStartedPulling="2026-04-22 19:58:34.988639446 +0000 UTC m=+3.408546216" lastFinishedPulling="2026-04-22 19:58:52.495654804 +0000 UTC m=+20.915561574" observedRunningTime="2026-04-22 19:58:53.22437946 +0000 UTC m=+21.644286252" watchObservedRunningTime="2026-04-22 19:58:53.224682345 +0000 UTC m=+21.644589135"
Apr 22 19:58:53.239024 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.238982 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zsbfm" podStartSLOduration=3.803069447 podStartE2EDuration="21.238972143s" podCreationTimestamp="2026-04-22 19:58:32 +0000 UTC" firstStartedPulling="2026-04-22 19:58:34.997352685 +0000 UTC m=+3.417259454" lastFinishedPulling="2026-04-22 19:58:52.433255368 +0000 UTC m=+20.853162150" observedRunningTime="2026-04-22 19:58:53.238900132 +0000 UTC m=+21.658806925" watchObservedRunningTime="2026-04-22 19:58:53.238972143 +0000 UTC m=+21.658878933"
Apr 22 19:58:53.264263 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.264221 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w5slr" podStartSLOduration=3.801277992 podStartE2EDuration="21.26420933s" podCreationTimestamp="2026-04-22 19:58:32 +0000 UTC" firstStartedPulling="2026-04-22 19:58:34.994757139 +0000 UTC m=+3.414663909" lastFinishedPulling="2026-04-22 19:58:52.457688463 +0000 UTC m=+20.877595247" observedRunningTime="2026-04-22 19:58:53.25261931 +0000 UTC m=+21.672526102" watchObservedRunningTime="2026-04-22 19:58:53.26420933 +0000 UTC m=+21.684116142"
Apr 22 19:58:53.915963 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.915933 2583 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 19:58:53.964900 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.964844 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-zsbfm"
Apr 22 19:58:53.965525 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.965500 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-zsbfm"
Apr 22 19:58:53.978639 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:53.978590 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gvfnk" podStartSLOduration=4.510683804 podStartE2EDuration="21.978573461s" podCreationTimestamp="2026-04-22 19:58:32 +0000 UTC" firstStartedPulling="2026-04-22 19:58:34.989795934 +0000 UTC m=+3.409702717" lastFinishedPulling="2026-04-22 19:58:52.457685599 +0000 UTC m=+20.877592374" observedRunningTime="2026-04-22 19:58:53.264393541 +0000 UTC m=+21.684300335" watchObservedRunningTime="2026-04-22 19:58:53.978573461 +0000 UTC m=+22.398480253"
Apr 22 19:58:54.051684 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:54.051520 2583 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:58:53.915959587Z","UUID":"df08684f-5f75-4c66-9988-2a2fd38ab58b","Handler":null,"Name":"","Endpoint":""}
Apr 22 19:58:54.053225 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:54.053196 2583 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 19:58:54.053225 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:54.053232 2583 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 19:58:54.092304 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:54.092278 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk"
Apr 22 19:58:54.092304 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:54.092304 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:58:54.092714 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:54.092408 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128"
Apr 22 19:58:54.092714 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:54.092563 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2" Apr 22 19:58:54.186621 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:54.186581 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" event={"ID":"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad","Type":"ContainerStarted","Data":"d49c88c4c1b197f1bdbadac53a53f9e5e0573ce0fc30493e8f77dfe8242ddcb3"} Apr 22 19:58:54.189710 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:54.189679 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" event={"ID":"363c0120-0870-4e33-8e8c-f6eeb68a30f9","Type":"ContainerStarted","Data":"4e2aae83daff18c376e3d39d1eee9c731e3d0a737daf975dea98fd05e7a0fc9e"} Apr 22 19:58:54.189849 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:54.189718 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" event={"ID":"363c0120-0870-4e33-8e8c-f6eeb68a30f9","Type":"ContainerStarted","Data":"bc5eef56837071b4474f18b30a02d69d2daface0fe7307a1c6871acc6bb9b0ae"} Apr 22 19:58:54.189849 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:54.189731 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" event={"ID":"363c0120-0870-4e33-8e8c-f6eeb68a30f9","Type":"ContainerStarted","Data":"4222ae9c6541f50172cce2339502254c4340acf941aa0360f31d78193ae9caf2"} Apr 22 19:58:54.191295 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:54.191261 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2vqmz" event={"ID":"ec8d36c4-4235-4e79-b1b6-bf2c0e464d5f","Type":"ContainerStarted","Data":"dc746064602ca3fc5d10669c7642a38747165e8c43f2f1e68ff8e5001c04dff0"} Apr 22 19:58:54.192027 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:54.192008 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zsbfm" Apr 22 19:58:54.192474 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:54.192455 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zsbfm" Apr 22 19:58:54.205615 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:54.205577 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2vqmz" podStartSLOduration=4.726376901 podStartE2EDuration="22.205563743s" podCreationTimestamp="2026-04-22 19:58:32 +0000 UTC" firstStartedPulling="2026-04-22 19:58:34.99960774 +0000 UTC m=+3.419514524" lastFinishedPulling="2026-04-22 19:58:52.478794597 +0000 UTC m=+20.898701366" observedRunningTime="2026-04-22 19:58:54.205466364 +0000 UTC m=+22.625373155" watchObservedRunningTime="2026-04-22 19:58:54.205563743 +0000 UTC m=+22.625470534" Apr 22 19:58:55.196175 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:55.195904 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" event={"ID":"80f4b313-2b5e-45df-b4a7-4e3e8651d1ad","Type":"ContainerStarted","Data":"c2feb8e724188e78a3e772430a9a0d1376d402b6fe82d20b88a98774aeaf6c0c"} Apr 22 19:58:55.214286 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:55.214232 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pv2vh" podStartSLOduration=3.4707773250000002 podStartE2EDuration="23.214214626s" 
podCreationTimestamp="2026-04-22 19:58:32 +0000 UTC" firstStartedPulling="2026-04-22 19:58:35.002404307 +0000 UTC m=+3.422311088" lastFinishedPulling="2026-04-22 19:58:54.745841615 +0000 UTC m=+23.165748389" observedRunningTime="2026-04-22 19:58:55.213788323 +0000 UTC m=+23.633695114" watchObservedRunningTime="2026-04-22 19:58:55.214214626 +0000 UTC m=+23.634121420" Apr 22 19:58:56.092355 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:56.092279 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv" Apr 22 19:58:56.092355 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:56.092312 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 19:58:56.092599 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:56.092405 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2" Apr 22 19:58:56.092599 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:56.092502 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128" Apr 22 19:58:56.200824 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:56.200784 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" event={"ID":"363c0120-0870-4e33-8e8c-f6eeb68a30f9","Type":"ContainerStarted","Data":"6d0bf8277a8254c88975a5b3911d9a63232e6cebd9588ec17b7273c50c45ae14"} Apr 22 19:58:58.092728 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:58.092463 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 19:58:58.093403 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:58.092542 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv" Apr 22 19:58:58.093403 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:58.092764 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128" Apr 22 19:58:58.093403 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:58.092813 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2" Apr 22 19:58:58.206730 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:58.206686 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" event={"ID":"363c0120-0870-4e33-8e8c-f6eeb68a30f9","Type":"ContainerStarted","Data":"a4791ce0492fc382037446796e7c035c8875320774b06b2db8bca465f158e511"} Apr 22 19:58:58.207038 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:58.207016 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:58.207255 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:58.207047 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:58.208493 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:58.208468 2583 generic.go:358] "Generic (PLEG): container finished" podID="1be306f2-45b8-43c3-9302-dce9d9ac5650" containerID="c012e5cf20c67ea33de5d9115e513dc00101be7415f726dada9f943da7ea5a6f" exitCode=0 Apr 22 19:58:58.208591 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:58.208502 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9bbm" event={"ID":"1be306f2-45b8-43c3-9302-dce9d9ac5650","Type":"ContainerDied","Data":"c012e5cf20c67ea33de5d9115e513dc00101be7415f726dada9f943da7ea5a6f"} Apr 22 19:58:58.222402 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:58.222381 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:58.222520 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:58.222443 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:58:58.268356 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:58.268312 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" podStartSLOduration=8.296534443 podStartE2EDuration="26.268299227s" podCreationTimestamp="2026-04-22 19:58:32 +0000 UTC" firstStartedPulling="2026-04-22 19:58:35.000386505 +0000 UTC m=+3.420293292" lastFinishedPulling="2026-04-22 19:58:52.972151292 +0000 UTC m=+21.392058076" observedRunningTime="2026-04-22 19:58:58.238291555 +0000 UTC m=+26.658198336" watchObservedRunningTime="2026-04-22 19:58:58.268299227 +0000 UTC m=+26.688206018" Apr 22 19:58:59.212050 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:59.211962 2583 generic.go:358] "Generic (PLEG): container finished" podID="1be306f2-45b8-43c3-9302-dce9d9ac5650" containerID="cd49c7518a78da5e406d23ae93ca28e24a16ba21821924f6663728c2688926b6" exitCode=0 Apr 22 19:58:59.212050 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:59.212024 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9bbm" event={"ID":"1be306f2-45b8-43c3-9302-dce9d9ac5650","Type":"ContainerDied","Data":"cd49c7518a78da5e406d23ae93ca28e24a16ba21821924f6663728c2688926b6"} Apr 22 19:58:59.212429 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:59.212254 2583 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 19:58:59.769412 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:59.769382 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rgxkv"] Apr 22 19:58:59.769576 
ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:59.769498 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv" Apr 22 19:58:59.769620 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:59.769582 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2" Apr 22 19:58:59.772398 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:59.772373 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q6lbk"] Apr 22 19:58:59.772535 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:58:59.772465 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 19:58:59.772602 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:58:59.772544 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128" Apr 22 19:59:00.216086 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:00.215829 2583 generic.go:358] "Generic (PLEG): container finished" podID="1be306f2-45b8-43c3-9302-dce9d9ac5650" containerID="ae2a1d11e4fc5873c910b08fcafadce47cf4fd5f93e4615b29e139d5e1840d10" exitCode=0 Apr 22 19:59:00.216086 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:00.215896 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9bbm" event={"ID":"1be306f2-45b8-43c3-9302-dce9d9ac5650","Type":"ContainerDied","Data":"ae2a1d11e4fc5873c910b08fcafadce47cf4fd5f93e4615b29e139d5e1840d10"} Apr 22 19:59:00.216477 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:00.216174 2583 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 19:59:01.092059 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:01.092018 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv" Apr 22 19:59:01.092270 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:01.092021 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 19:59:01.092270 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:01.092162 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2" Apr 22 19:59:01.092270 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:01.092242 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128" Apr 22 19:59:02.461788 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:02.461754 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:59:02.462439 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:02.462099 2583 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 19:59:02.477091 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:02.477041 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" podUID="363c0120-0870-4e33-8e8c-f6eeb68a30f9" containerName="ovnkube-controller" probeResult="failure" output="" Apr 22 19:59:02.487503 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:02.487462 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" podUID="363c0120-0870-4e33-8e8c-f6eeb68a30f9" containerName="ovnkube-controller" probeResult="failure" output="" Apr 22 19:59:03.092198 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:03.092168 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv" Apr 22 19:59:03.092381 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:03.092167 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 19:59:03.092381 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:03.092297 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2" Apr 22 19:59:03.092481 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:03.092372 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128" Apr 22 19:59:05.091777 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:05.091743 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv" Apr 22 19:59:05.092389 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:05.091888 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 19:59:05.092389 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:05.091881 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rgxkv" podUID="428fb0f0-657f-42fe-874e-120700caf3c2" Apr 22 19:59:05.092389 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:05.091982 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q6lbk" podUID="fbf58ad5-56ae-4535-a07f-980865760128" Apr 22 19:59:05.743068 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:05.743029 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs\") pod \"network-metrics-daemon-q6lbk\" (UID: \"fbf58ad5-56ae-4535-a07f-980865760128\") " pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 19:59:05.743324 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:05.743197 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:59:05.743324 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:05.743269 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs podName:fbf58ad5-56ae-4535-a07f-980865760128 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:37.743251382 +0000 UTC m=+66.163158170 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs") pod "network-metrics-daemon-q6lbk" (UID: "fbf58ad5-56ae-4535-a07f-980865760128") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:59:05.844024 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:05.843986 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rw7zl\" (UniqueName: \"kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl\") pod \"network-check-target-rgxkv\" (UID: \"428fb0f0-657f-42fe-874e-120700caf3c2\") " pod="openshift-network-diagnostics/network-check-target-rgxkv" Apr 22 19:59:05.844184 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:05.844123 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:59:05.844184 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:05.844138 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:59:05.844184 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:05.844146 2583 projected.go:194] Error preparing data for projected volume kube-api-access-rw7zl for pod openshift-network-diagnostics/network-check-target-rgxkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:59:05.844287 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:05.844192 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl podName:428fb0f0-657f-42fe-874e-120700caf3c2 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:37.8441793 +0000 UTC m=+66.264086069 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-rw7zl" (UniqueName: "kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl") pod "network-check-target-rgxkv" (UID: "428fb0f0-657f-42fe-874e-120700caf3c2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:59:06.419759 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.419724 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-221.ec2.internal" event="NodeReady" Apr 22 19:59:06.420222 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.419938 2583 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:59:06.471382 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.471300 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2llws"] Apr 22 19:59:06.494091 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.494043 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kwgk6"] Apr 22 19:59:06.494795 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.494721 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-2llws" Apr 22 19:59:06.498319 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.498288 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:59:06.498463 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.498323 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:59:06.498643 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.498603 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2xtr6\"" Apr 22 19:59:06.512393 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.512365 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2llws"] Apr 22 19:59:06.512492 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.512411 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kwgk6"] Apr 22 19:59:06.512492 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.512473 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kwgk6" Apr 22 19:59:06.514755 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.514736 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:59:06.515343 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.515323 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:59:06.515443 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.515331 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-g9zd7\"" Apr 22 19:59:06.515443 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.515380 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:59:06.650842 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.650810 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-tmp-dir\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:06.650842 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.650849 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-config-volume\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:06.651076 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.650896 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:06.651076 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.650922 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtj5m\" (UniqueName: 
\"kubernetes.io/projected/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-kube-api-access-rtj5m\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:06.651076 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.650944 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert\") pod \"ingress-canary-kwgk6\" (UID: \"c938b1dd-fed3-4797-aab7-2136204f1cd8\") " pod="openshift-ingress-canary/ingress-canary-kwgk6" Apr 22 19:59:06.651076 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.651002 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc7qd\" (UniqueName: \"kubernetes.io/projected/c938b1dd-fed3-4797-aab7-2136204f1cd8-kube-api-access-wc7qd\") pod \"ingress-canary-kwgk6\" (UID: \"c938b1dd-fed3-4797-aab7-2136204f1cd8\") " pod="openshift-ingress-canary/ingress-canary-kwgk6" Apr 22 19:59:06.752264 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.752185 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert\") pod \"ingress-canary-kwgk6\" (UID: \"c938b1dd-fed3-4797-aab7-2136204f1cd8\") " pod="openshift-ingress-canary/ingress-canary-kwgk6" Apr 22 19:59:06.752264 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.752242 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc7qd\" (UniqueName: \"kubernetes.io/projected/c938b1dd-fed3-4797-aab7-2136204f1cd8-kube-api-access-wc7qd\") pod \"ingress-canary-kwgk6\" (UID: \"c938b1dd-fed3-4797-aab7-2136204f1cd8\") " pod="openshift-ingress-canary/ingress-canary-kwgk6" Apr 22 19:59:06.752454 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.752281 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-tmp-dir\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:06.752454 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.752303 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-config-volume\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:06.752454 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.752326 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:06.752454 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:06.752341 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:59:06.752454 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.752353 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtj5m\" (UniqueName: \"kubernetes.io/projected/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-kube-api-access-rtj5m\") pod \"dns-default-2llws\" (UID: 
\"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:06.752454 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:06.752427 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert podName:c938b1dd-fed3-4797-aab7-2136204f1cd8 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:07.252405872 +0000 UTC m=+35.672312649 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert") pod "ingress-canary-kwgk6" (UID: "c938b1dd-fed3-4797-aab7-2136204f1cd8") : secret "canary-serving-cert" not found Apr 22 19:59:06.752678 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:06.752460 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:59:06.752678 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:06.752498 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls podName:7b46c2ec-f7aa-4451-90a4-5e3695b9ed78 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:07.252486939 +0000 UTC m=+35.672393711 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls") pod "dns-default-2llws" (UID: "7b46c2ec-f7aa-4451-90a4-5e3695b9ed78") : secret "dns-default-metrics-tls" not found Apr 22 19:59:06.752678 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.752662 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-tmp-dir\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:06.753012 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.752996 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-config-volume\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:06.764585 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.764562 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtj5m\" (UniqueName: \"kubernetes.io/projected/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-kube-api-access-rtj5m\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:06.764729 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:06.764709 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc7qd\" (UniqueName: \"kubernetes.io/projected/c938b1dd-fed3-4797-aab7-2136204f1cd8-kube-api-access-wc7qd\") pod \"ingress-canary-kwgk6\" (UID: \"c938b1dd-fed3-4797-aab7-2136204f1cd8\") " pod="openshift-ingress-canary/ingress-canary-kwgk6" Apr 22 19:59:07.091797 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:07.091702 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 19:59:07.091996 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:07.091702 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv" Apr 22 19:59:07.094884 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:07.094843 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:59:07.094884 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:07.094877 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:59:07.094884 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:07.094843 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vnzkf\"" Apr 22 19:59:07.095096 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:07.094844 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-h4q6q\"" Apr 22 19:59:07.095096 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:07.094847 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:59:07.231682 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:07.231504 2583 generic.go:358] "Generic (PLEG): container finished" podID="1be306f2-45b8-43c3-9302-dce9d9ac5650" containerID="8c306eefa30ce4e1725fd6178dcde3bbadec45f019c31a521dd29261c00ff272" exitCode=0 Apr 22 19:59:07.231682 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:07.231577 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9bbm" event={"ID":"1be306f2-45b8-43c3-9302-dce9d9ac5650","Type":"ContainerDied","Data":"8c306eefa30ce4e1725fd6178dcde3bbadec45f019c31a521dd29261c00ff272"} Apr 22 19:59:07.256209 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:07.256185 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:07.256352 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:07.256227 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert\") pod \"ingress-canary-kwgk6\" (UID: \"c938b1dd-fed3-4797-aab7-2136204f1cd8\") " pod="openshift-ingress-canary/ingress-canary-kwgk6" Apr 22 19:59:07.256352 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:07.256309 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:59:07.256460 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:07.256364 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls podName:7b46c2ec-f7aa-4451-90a4-5e3695b9ed78 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:08.25634824 +0000 UTC m=+36.676255014 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls") pod "dns-default-2llws" (UID: "7b46c2ec-f7aa-4451-90a4-5e3695b9ed78") : secret "dns-default-metrics-tls" not found Apr 22 19:59:07.256460 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:07.256364 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:59:07.256460 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:07.256420 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert podName:c938b1dd-fed3-4797-aab7-2136204f1cd8 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:08.256404793 +0000 UTC m=+36.676311577 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert") pod "ingress-canary-kwgk6" (UID: "c938b1dd-fed3-4797-aab7-2136204f1cd8") : secret "canary-serving-cert" not found Apr 22 19:59:08.236239 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:08.236207 2583 generic.go:358] "Generic (PLEG): container finished" podID="1be306f2-45b8-43c3-9302-dce9d9ac5650" containerID="09503a126b1dc64af038ba8e2e0c9039969b3aed4eac06567fc0c67508f2ab05" exitCode=0 Apr 22 19:59:08.236239 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:08.236251 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9bbm" event={"ID":"1be306f2-45b8-43c3-9302-dce9d9ac5650","Type":"ContainerDied","Data":"09503a126b1dc64af038ba8e2e0c9039969b3aed4eac06567fc0c67508f2ab05"} Apr 22 19:59:08.264792 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:08.264762 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert\") pod \"ingress-canary-kwgk6\" (UID: \"c938b1dd-fed3-4797-aab7-2136204f1cd8\") " pod="openshift-ingress-canary/ingress-canary-kwgk6" Apr 22 19:59:08.264919 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:08.264822 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:08.264967 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:08.264917 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:59:08.264967 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:08.264961 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls podName:7b46c2ec-f7aa-4451-90a4-5e3695b9ed78 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:10.264948589 +0000 UTC m=+38.684855359 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls") pod "dns-default-2llws" (UID: "7b46c2ec-f7aa-4451-90a4-5e3695b9ed78") : secret "dns-default-metrics-tls" not found Apr 22 19:59:08.265036 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:08.264916 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:59:08.265068 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:08.265047 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert podName:c938b1dd-fed3-4797-aab7-2136204f1cd8 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:10.265028141 +0000 UTC m=+38.684934914 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert") pod "ingress-canary-kwgk6" (UID: "c938b1dd-fed3-4797-aab7-2136204f1cd8") : secret "canary-serving-cert" not found Apr 22 19:59:09.240635 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:09.240604 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x9bbm" event={"ID":"1be306f2-45b8-43c3-9302-dce9d9ac5650","Type":"ContainerStarted","Data":"d578be0ea6ca61c74eca45b01f7b495cda4950de6382d3613e57760f229edf74"} Apr 22 19:59:10.278678 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:10.278645 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:10.278678 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:10.278684 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert\") pod \"ingress-canary-kwgk6\" (UID: \"c938b1dd-fed3-4797-aab7-2136204f1cd8\") " pod="openshift-ingress-canary/ingress-canary-kwgk6" Apr 22 19:59:10.279246 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:10.278785 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:59:10.279246 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:10.278818 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:59:10.279246 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:10.278838 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert podName:c938b1dd-fed3-4797-aab7-2136204f1cd8 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:14.278822396 +0000 UTC m=+42.698729165 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert") pod "ingress-canary-kwgk6" (UID: "c938b1dd-fed3-4797-aab7-2136204f1cd8") : secret "canary-serving-cert" not found Apr 22 19:59:10.279246 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:10.278908 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls podName:7b46c2ec-f7aa-4451-90a4-5e3695b9ed78 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:59:14.278890627 +0000 UTC m=+42.698797419 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls") pod "dns-default-2llws" (UID: "7b46c2ec-f7aa-4451-90a4-5e3695b9ed78") : secret "dns-default-metrics-tls" not found Apr 22 19:59:14.304228 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:14.304188 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:14.304228 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:14.304230 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert\") pod \"ingress-canary-kwgk6\" (UID: \"c938b1dd-fed3-4797-aab7-2136204f1cd8\") " pod="openshift-ingress-canary/ingress-canary-kwgk6" Apr 22 19:59:14.304675 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:14.304317 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:59:14.304675 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:14.304332 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:59:14.304675 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:14.304380 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert podName:c938b1dd-fed3-4797-aab7-2136204f1cd8 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:22.304356023 +0000 UTC m=+50.724262807 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert") pod "ingress-canary-kwgk6" (UID: "c938b1dd-fed3-4797-aab7-2136204f1cd8") : secret "canary-serving-cert" not found Apr 22 19:59:14.304675 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:14.304404 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls podName:7b46c2ec-f7aa-4451-90a4-5e3695b9ed78 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:22.304397321 +0000 UTC m=+50.724304090 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls") pod "dns-default-2llws" (UID: "7b46c2ec-f7aa-4451-90a4-5e3695b9ed78") : secret "dns-default-metrics-tls" not found Apr 22 19:59:22.364759 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:22.364726 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:22.364759 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:22.364771 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert\") pod \"ingress-canary-kwgk6\" (UID: \"c938b1dd-fed3-4797-aab7-2136204f1cd8\") " pod="openshift-ingress-canary/ingress-canary-kwgk6" Apr 22 19:59:22.365288 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:22.364885 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:59:22.365288 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:22.364887 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:59:22.365288 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:22.364951 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls podName:7b46c2ec-f7aa-4451-90a4-5e3695b9ed78 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:38.364935697 +0000 UTC m=+66.784842470 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls") pod "dns-default-2llws" (UID: "7b46c2ec-f7aa-4451-90a4-5e3695b9ed78") : secret "dns-default-metrics-tls" not found Apr 22 19:59:22.365288 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:22.364965 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert podName:c938b1dd-fed3-4797-aab7-2136204f1cd8 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:38.364958682 +0000 UTC m=+66.784865451 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert") pod "ingress-canary-kwgk6" (UID: "c938b1dd-fed3-4797-aab7-2136204f1cd8") : secret "canary-serving-cert" not found Apr 22 19:59:32.487015 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:32.486974 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ws7ww" Apr 22 19:59:32.516461 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:32.516407 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-x9bbm" podStartSLOduration=29.290329504 podStartE2EDuration="1m0.516391278s" podCreationTimestamp="2026-04-22 19:58:32 +0000 UTC" firstStartedPulling="2026-04-22 19:58:34.997144189 +0000 UTC m=+3.417050957" lastFinishedPulling="2026-04-22 19:59:06.223205962 +0000 UTC m=+34.643112731" observedRunningTime="2026-04-22 19:59:09.270033956 +0000 UTC m=+37.689940747" watchObservedRunningTime="2026-04-22 19:59:32.516391278 +0000 UTC m=+60.936298069" Apr 22 19:59:37.771147 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:37.771107 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs\") pod \"network-metrics-daemon-q6lbk\" (UID: \"fbf58ad5-56ae-4535-a07f-980865760128\") " pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 19:59:37.773628 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:37.773608 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:59:37.782207 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:37.782183 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:59:37.782297 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:37.782242 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs podName:fbf58ad5-56ae-4535-a07f-980865760128 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:41.782225874 +0000 UTC m=+130.202132651 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs") pod "network-metrics-daemon-q6lbk" (UID: "fbf58ad5-56ae-4535-a07f-980865760128") : secret "metrics-daemon-secret" not found Apr 22 19:59:37.871839 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:37.871811 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rw7zl\" (UniqueName: \"kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl\") pod \"network-check-target-rgxkv\" (UID: \"428fb0f0-657f-42fe-874e-120700caf3c2\") " pod="openshift-network-diagnostics/network-check-target-rgxkv" Apr 22 19:59:37.874333 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:37.874318 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:59:37.884472 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:37.884456 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:59:37.896075 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:37.896048 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw7zl\" (UniqueName: \"kubernetes.io/projected/428fb0f0-657f-42fe-874e-120700caf3c2-kube-api-access-rw7zl\") pod \"network-check-target-rgxkv\" (UID: \"428fb0f0-657f-42fe-874e-120700caf3c2\") " pod="openshift-network-diagnostics/network-check-target-rgxkv" Apr 22 19:59:38.008764 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:38.008738 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-h4q6q\"" Apr 22 19:59:38.017529 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:38.017507 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rgxkv" Apr 22 19:59:38.198479 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:38.198450 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rgxkv"] Apr 22 19:59:38.202312 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:59:38.202285 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod428fb0f0_657f_42fe_874e_120700caf3c2.slice/crio-1ec93a2a6bd09986ba31fb6981a338913fb6081a9d1e6de8c89850327b532878 WatchSource:0}: Error finding container 1ec93a2a6bd09986ba31fb6981a338913fb6081a9d1e6de8c89850327b532878: Status 404 returned error can't find the container with id 1ec93a2a6bd09986ba31fb6981a338913fb6081a9d1e6de8c89850327b532878 Apr 22 19:59:38.293391 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:38.293311 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rgxkv" event={"ID":"428fb0f0-657f-42fe-874e-120700caf3c2","Type":"ContainerStarted","Data":"1ec93a2a6bd09986ba31fb6981a338913fb6081a9d1e6de8c89850327b532878"} Apr 22 19:59:38.375238 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:38.375208 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 19:59:38.375392 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:38.375253 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert\") pod \"ingress-canary-kwgk6\" (UID: \"c938b1dd-fed3-4797-aab7-2136204f1cd8\") " pod="openshift-ingress-canary/ingress-canary-kwgk6" Apr 22 19:59:38.375392 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:38.375357 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:59:38.375392 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:38.375375 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:59:38.375486 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:38.375420 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls podName:7b46c2ec-f7aa-4451-90a4-5e3695b9ed78 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:10.375403353 +0000 UTC m=+98.795310128 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls") pod "dns-default-2llws" (UID: "7b46c2ec-f7aa-4451-90a4-5e3695b9ed78") : secret "dns-default-metrics-tls" not found Apr 22 19:59:38.375486 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:38.375443 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert podName:c938b1dd-fed3-4797-aab7-2136204f1cd8 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:10.375430916 +0000 UTC m=+98.795337685 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert") pod "ingress-canary-kwgk6" (UID: "c938b1dd-fed3-4797-aab7-2136204f1cd8") : secret "canary-serving-cert" not found Apr 22 19:59:39.160155 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.160099 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv"] Apr 22 19:59:39.163594 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.163570 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-htt2l"] Apr 22 19:59:39.163733 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.163708 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" Apr 22 19:59:39.166268 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.166243 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 19:59:39.166398 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.166364 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 19:59:39.166468 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.166402 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 19:59:39.166468 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.166422 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:59:39.166582 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.166422 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-b42mz\"" Apr 22 19:59:39.166804 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.166782 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-htt2l" Apr 22 19:59:39.168193 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.168174 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-br2nv"] Apr 22 19:59:39.168802 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.168780 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:59:39.169041 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.169021 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-jtpwb\"" Apr 22 19:59:39.169137 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.169093 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 19:59:39.171223 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.171201 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.176175 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.174584 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 19:59:39.176175 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.174650 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:59:39.176175 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.174936 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-2x2j6\"" Apr 22 19:59:39.176175 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.175057 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-htt2l"] Apr 22 19:59:39.176175 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.175330 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 19:59:39.176175 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.175505 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv"] Apr 22 19:59:39.176808 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.176790 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:59:39.180542 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.180519 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/93becda5-df4b-41f0-954c-ed611504c70c-snapshots\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.180640 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.180584 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93becda5-df4b-41f0-954c-ed611504c70c-service-ca-bundle\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.180640 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.180617 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408c985c-a3f6-453e-9cd5-3acc576e7673-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-vwfsv\" (UID: \"408c985c-a3f6-453e-9cd5-3acc576e7673\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" Apr 22 19:59:39.180761 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.180644 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/408c985c-a3f6-453e-9cd5-3acc576e7673-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-vwfsv\" (UID: \"408c985c-a3f6-453e-9cd5-3acc576e7673\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" Apr 22 19:59:39.180761 
ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.180682 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93becda5-df4b-41f0-954c-ed611504c70c-tmp\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.180761 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.180725 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93becda5-df4b-41f0-954c-ed611504c70c-serving-cert\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.180761 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.180752 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93becda5-df4b-41f0-954c-ed611504c70c-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.180982 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.180923 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrwpm\" (UniqueName: \"kubernetes.io/projected/93becda5-df4b-41f0-954c-ed611504c70c-kube-api-access-mrwpm\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.181035 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.180987 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfz9t\" (UniqueName: \"kubernetes.io/projected/47e5d131-56cf-49a5-bc27-d784bcab468a-kube-api-access-gfz9t\") pod \"volume-data-source-validator-7c6cbb6c87-htt2l\" (UID: \"47e5d131-56cf-49a5-bc27-d784bcab468a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-htt2l" Apr 22 19:59:39.181113 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.181087 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p8b4\" (UniqueName: \"kubernetes.io/projected/408c985c-a3f6-453e-9cd5-3acc576e7673-kube-api-access-4p8b4\") pod \"kube-storage-version-migrator-operator-6769c5d45-vwfsv\" (UID: \"408c985c-a3f6-453e-9cd5-3acc576e7673\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" Apr 22 19:59:39.182310 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.181735 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 19:59:39.187384 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.187363 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-br2nv"] Apr 22 19:59:39.267201 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.267168 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c"] Apr 22 19:59:39.270616 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.270592 2583 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-ingress/router-default-d7dc7d6db-nq4vr"] Apr 22 19:59:39.270814 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.270745 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" Apr 22 19:59:39.273050 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.272890 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-6w7gb\"" Apr 22 19:59:39.273050 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.272902 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:59:39.273050 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.272954 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 19:59:39.273050 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.272903 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 19:59:39.273317 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.273185 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 19:59:39.273840 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.273817 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg"] Apr 22 19:59:39.273996 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.273979 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.275779 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.275758 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 19:59:39.275779 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.275776 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 19:59:39.276089 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.276074 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-4mv7w\"" Apr 22 19:59:39.276167 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.276091 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 19:59:39.276341 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.276325 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 19:59:39.276399 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.276380 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 19:59:39.276550 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.276533 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 19:59:39.276882 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.276829 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" Apr 22 19:59:39.279809 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.279667 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 19:59:39.279809 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.279712 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 19:59:39.279809 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.279764 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-nprh4\"" Apr 22 19:59:39.280263 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.280240 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:59:39.280789 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.280767 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c"] Apr 22 19:59:39.281930 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.281856 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg"] Apr 22 19:59:39.282070 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.281948 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32c9ae24-cac4-4927-a2d1-a0542ac9e54d-config\") pod \"service-ca-operator-d6fc45fc5-csj2c\" (UID: \"32c9ae24-cac4-4927-a2d1-a0542ac9e54d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" Apr 22 19:59:39.282070 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.281990 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93becda5-df4b-41f0-954c-ed611504c70c-serving-cert\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.282070 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.282021 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93becda5-df4b-41f0-954c-ed611504c70c-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.282247 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.282045 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrwpm\" (UniqueName: \"kubernetes.io/projected/93becda5-df4b-41f0-954c-ed611504c70c-kube-api-access-mrwpm\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.282247 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.282159 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfz9t\" (UniqueName: \"kubernetes.io/projected/47e5d131-56cf-49a5-bc27-d784bcab468a-kube-api-access-gfz9t\") pod \"volume-data-source-validator-7c6cbb6c87-htt2l\" 
(UID: \"47e5d131-56cf-49a5-bc27-d784bcab468a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-htt2l" Apr 22 19:59:39.282247 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.282181 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p8b4\" (UniqueName: \"kubernetes.io/projected/408c985c-a3f6-453e-9cd5-3acc576e7673-kube-api-access-4p8b4\") pod \"kube-storage-version-migrator-operator-6769c5d45-vwfsv\" (UID: \"408c985c-a3f6-453e-9cd5-3acc576e7673\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" Apr 22 19:59:39.282413 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.282259 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/93becda5-df4b-41f0-954c-ed611504c70c-snapshots\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.282413 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.282276 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf62t\" (UniqueName: \"kubernetes.io/projected/32c9ae24-cac4-4927-a2d1-a0542ac9e54d-kube-api-access-sf62t\") pod \"service-ca-operator-d6fc45fc5-csj2c\" (UID: \"32c9ae24-cac4-4927-a2d1-a0542ac9e54d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" Apr 22 19:59:39.282413 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.282306 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c9ae24-cac4-4927-a2d1-a0542ac9e54d-serving-cert\") pod \"service-ca-operator-d6fc45fc5-csj2c\" (UID: \"32c9ae24-cac4-4927-a2d1-a0542ac9e54d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" Apr 22 19:59:39.282413 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.282324 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93becda5-df4b-41f0-954c-ed611504c70c-service-ca-bundle\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.282413 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.282343 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408c985c-a3f6-453e-9cd5-3acc576e7673-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-vwfsv\" (UID: \"408c985c-a3f6-453e-9cd5-3acc576e7673\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" Apr 22 19:59:39.282413 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.282360 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/408c985c-a3f6-453e-9cd5-3acc576e7673-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-vwfsv\" (UID: \"408c985c-a3f6-453e-9cd5-3acc576e7673\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" Apr 22 19:59:39.282413 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.282385 2583 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93becda5-df4b-41f0-954c-ed611504c70c-tmp\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.282745 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.282661 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-d7dc7d6db-nq4vr"] Apr 22 19:59:39.282745 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.282715 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93becda5-df4b-41f0-954c-ed611504c70c-tmp\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.283482 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.283441 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/93becda5-df4b-41f0-954c-ed611504c70c-snapshots\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.283579 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.283560 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/408c985c-a3f6-453e-9cd5-3acc576e7673-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-vwfsv\" (UID: \"408c985c-a3f6-453e-9cd5-3acc576e7673\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" Apr 22 19:59:39.283646 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.283601 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93becda5-df4b-41f0-954c-ed611504c70c-service-ca-bundle\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.283646 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.283635 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93becda5-df4b-41f0-954c-ed611504c70c-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.285714 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.285692 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93becda5-df4b-41f0-954c-ed611504c70c-serving-cert\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.286500 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.286477 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/408c985c-a3f6-453e-9cd5-3acc576e7673-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-vwfsv\" (UID: \"408c985c-a3f6-453e-9cd5-3acc576e7673\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" Apr 22 19:59:39.296948 ip-10-0-135-221 kubenswrapper[2583]: 
I0422 19:59:39.296923 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrwpm\" (UniqueName: \"kubernetes.io/projected/93becda5-df4b-41f0-954c-ed611504c70c-kube-api-access-mrwpm\") pod \"insights-operator-585dfdc468-br2nv\" (UID: \"93becda5-df4b-41f0-954c-ed611504c70c\") " pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.297073 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.296963 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p8b4\" (UniqueName: \"kubernetes.io/projected/408c985c-a3f6-453e-9cd5-3acc576e7673-kube-api-access-4p8b4\") pod \"kube-storage-version-migrator-operator-6769c5d45-vwfsv\" (UID: \"408c985c-a3f6-453e-9cd5-3acc576e7673\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" Apr 22 19:59:39.297073 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.297055 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfz9t\" (UniqueName: \"kubernetes.io/projected/47e5d131-56cf-49a5-bc27-d784bcab468a-kube-api-access-gfz9t\") pod \"volume-data-source-validator-7c6cbb6c87-htt2l\" (UID: \"47e5d131-56cf-49a5-bc27-d784bcab468a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-htt2l" Apr 22 19:59:39.383225 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.383182 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sf62t\" (UniqueName: \"kubernetes.io/projected/32c9ae24-cac4-4927-a2d1-a0542ac9e54d-kube-api-access-sf62t\") pod \"service-ca-operator-d6fc45fc5-csj2c\" (UID: \"32c9ae24-cac4-4927-a2d1-a0542ac9e54d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" Apr 22 19:59:39.383416 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.383250 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c9ae24-cac4-4927-a2d1-a0542ac9e54d-serving-cert\") pod \"service-ca-operator-d6fc45fc5-csj2c\" (UID: \"32c9ae24-cac4-4927-a2d1-a0542ac9e54d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" Apr 22 19:59:39.383416 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.383298 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vwjl\" (UniqueName: \"kubernetes.io/projected/24b5efed-ced4-4697-80ee-7021d3b4c69b-kube-api-access-5vwjl\") pod \"cluster-samples-operator-6dc5bdb6b4-zcjkg\" (UID: \"24b5efed-ced4-4697-80ee-7021d3b4c69b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" Apr 22 19:59:39.383416 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.383340 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32c9ae24-cac4-4927-a2d1-a0542ac9e54d-config\") pod \"service-ca-operator-d6fc45fc5-csj2c\" (UID: \"32c9ae24-cac4-4927-a2d1-a0542ac9e54d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" Apr 22 19:59:39.383416 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.383363 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-default-certificate\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " 
pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.383416 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.383389 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.383652 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.383419 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-stats-auth\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.383652 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.383472 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zcjkg\" (UID: \"24b5efed-ced4-4697-80ee-7021d3b4c69b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" Apr 22 19:59:39.383652 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.383502 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.383652 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.383519 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jk8z\" (UniqueName: \"kubernetes.io/projected/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-kube-api-access-5jk8z\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.384049 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.384020 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32c9ae24-cac4-4927-a2d1-a0542ac9e54d-config\") pod \"service-ca-operator-d6fc45fc5-csj2c\" (UID: \"32c9ae24-cac4-4927-a2d1-a0542ac9e54d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" Apr 22 19:59:39.386204 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.386177 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c9ae24-cac4-4927-a2d1-a0542ac9e54d-serving-cert\") pod \"service-ca-operator-d6fc45fc5-csj2c\" (UID: \"32c9ae24-cac4-4927-a2d1-a0542ac9e54d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" Apr 22 19:59:39.390925 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.390901 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf62t\" (UniqueName: \"kubernetes.io/projected/32c9ae24-cac4-4927-a2d1-a0542ac9e54d-kube-api-access-sf62t\") pod \"service-ca-operator-d6fc45fc5-csj2c\" (UID: \"32c9ae24-cac4-4927-a2d1-a0542ac9e54d\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" Apr 22 19:59:39.478779 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.478738 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" Apr 22 19:59:39.484814 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.484776 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zcjkg\" (UID: \"24b5efed-ced4-4697-80ee-7021d3b4c69b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" Apr 22 19:59:39.484956 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.484821 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.484956 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.484839 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jk8z\" (UniqueName: \"kubernetes.io/projected/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-kube-api-access-5jk8z\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.485067 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:39.484953 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:59:39.485067 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.484983 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vwjl\" (UniqueName: \"kubernetes.io/projected/24b5efed-ced4-4697-80ee-7021d3b4c69b-kube-api-access-5vwjl\") pod \"cluster-samples-operator-6dc5bdb6b4-zcjkg\" (UID: \"24b5efed-ced4-4697-80ee-7021d3b4c69b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" Apr 22 19:59:39.485067 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:39.485018 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs podName:a7cb3649-fe15-4eeb-a0e1-e3b600a54358 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:39.984999866 +0000 UTC m=+68.404906660 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs") pod "router-default-d7dc7d6db-nq4vr" (UID: "a7cb3649-fe15-4eeb-a0e1-e3b600a54358") : secret "router-metrics-certs-default" not found Apr 22 19:59:39.485067 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:39.484954 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:59:39.485067 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.485064 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-default-certificate\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.485318 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.485089 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.485318 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:39.485119 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls podName:24b5efed-ced4-4697-80ee-7021d3b4c69b nodeName:}" failed. No retries permitted until 2026-04-22 19:59:39.985098771 +0000 UTC m=+68.405005554 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zcjkg" (UID: "24b5efed-ced4-4697-80ee-7021d3b4c69b") : secret "samples-operator-tls" not found Apr 22 19:59:39.485318 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.485165 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-stats-auth\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.485318 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:39.485221 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle podName:a7cb3649-fe15-4eeb-a0e1-e3b600a54358 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:39.985208954 +0000 UTC m=+68.405115727 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle") pod "router-default-d7dc7d6db-nq4vr" (UID: "a7cb3649-fe15-4eeb-a0e1-e3b600a54358") : configmap references non-existent config key: service-ca.crt Apr 22 19:59:39.486738 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.486714 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-htt2l" Apr 22 19:59:39.488235 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.488126 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-stats-auth\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.488235 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.488187 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-default-certificate\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.493449 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.493205 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-br2nv" Apr 22 19:59:39.494033 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.494012 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jk8z\" (UniqueName: \"kubernetes.io/projected/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-kube-api-access-5jk8z\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.494794 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.494772 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vwjl\" (UniqueName: \"kubernetes.io/projected/24b5efed-ced4-4697-80ee-7021d3b4c69b-kube-api-access-5vwjl\") pod \"cluster-samples-operator-6dc5bdb6b4-zcjkg\" (UID: \"24b5efed-ced4-4697-80ee-7021d3b4c69b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" Apr 22 19:59:39.583716 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.583290 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" Apr 22 19:59:39.660727 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.660544 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv"] Apr 22 19:59:39.663804 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:59:39.663776 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod408c985c_a3f6_453e_9cd5_3acc576e7673.slice/crio-12403fcdded0754ee8460f39b6616749938651cdc49bd4bc4575b6e6ab617dbe WatchSource:0}: Error finding container 12403fcdded0754ee8460f39b6616749938651cdc49bd4bc4575b6e6ab617dbe: Status 404 returned error can't find the container with id 12403fcdded0754ee8460f39b6616749938651cdc49bd4bc4575b6e6ab617dbe Apr 22 19:59:39.731598 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.731516 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c"] Apr 22 19:59:39.734670 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:59:39.734635 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32c9ae24_cac4_4927_a2d1_a0542ac9e54d.slice/crio-2f03a3a0d92f4e150fea5fcbdaeb320666b451238a76feb3fcaaa7b426b91d44 WatchSource:0}: Error finding container 2f03a3a0d92f4e150fea5fcbdaeb320666b451238a76feb3fcaaa7b426b91d44: Status 404 returned error can't find the container with id 2f03a3a0d92f4e150fea5fcbdaeb320666b451238a76feb3fcaaa7b426b91d44 Apr 22 19:59:39.877581 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.877549 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-htt2l"] Apr 22 19:59:39.880833 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.880804 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-br2nv"] Apr 22 19:59:39.880991 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:59:39.880933 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47e5d131_56cf_49a5_bc27_d784bcab468a.slice/crio-c588159e3044be6431c5dd5727670feaf242cb9b0cdae87117a202ffd5a54c52 WatchSource:0}: Error finding container c588159e3044be6431c5dd5727670feaf242cb9b0cdae87117a202ffd5a54c52: Status 404 returned error can't find the container with id c588159e3044be6431c5dd5727670feaf242cb9b0cdae87117a202ffd5a54c52 Apr 22 19:59:39.884029 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:59:39.884003 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93becda5_df4b_41f0_954c_ed611504c70c.slice/crio-95114f94f231b755c68042ba5f274553dcb67db5b0b115b84db36e9f7c195f8d WatchSource:0}: Error finding container 95114f94f231b755c68042ba5f274553dcb67db5b0b115b84db36e9f7c195f8d: Status 404 returned error can't find the container with id 95114f94f231b755c68042ba5f274553dcb67db5b0b115b84db36e9f7c195f8d Apr 22 19:59:39.990815 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.990744 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: 
\"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.990815 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.990795 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zcjkg\" (UID: \"24b5efed-ced4-4697-80ee-7021d3b4c69b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" Apr 22 19:59:39.991022 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:39.990894 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:59:39.991022 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:39.990917 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle podName:a7cb3649-fe15-4eeb-a0e1-e3b600a54358 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:40.990898444 +0000 UTC m=+69.410805217 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle") pod "router-default-d7dc7d6db-nq4vr" (UID: "a7cb3649-fe15-4eeb-a0e1-e3b600a54358") : configmap references non-existent config key: service-ca.crt Apr 22 19:59:39.991022 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:39.990949 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls podName:24b5efed-ced4-4697-80ee-7021d3b4c69b nodeName:}" failed. No retries permitted until 2026-04-22 19:59:40.990929703 +0000 UTC m=+69.410836471 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zcjkg" (UID: "24b5efed-ced4-4697-80ee-7021d3b4c69b") : secret "samples-operator-tls" not found Apr 22 19:59:39.991022 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:39.990962 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:39.991192 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:39.991045 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:59:39.991192 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:39.991071 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs podName:a7cb3649-fe15-4eeb-a0e1-e3b600a54358 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:40.991064856 +0000 UTC m=+69.410971624 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs") pod "router-default-d7dc7d6db-nq4vr" (UID: "a7cb3649-fe15-4eeb-a0e1-e3b600a54358") : secret "router-metrics-certs-default" not found Apr 22 19:59:40.299039 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:40.298956 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" event={"ID":"32c9ae24-cac4-4927-a2d1-a0542ac9e54d","Type":"ContainerStarted","Data":"2f03a3a0d92f4e150fea5fcbdaeb320666b451238a76feb3fcaaa7b426b91d44"} Apr 22 19:59:40.300241 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:40.300204 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-htt2l" event={"ID":"47e5d131-56cf-49a5-bc27-d784bcab468a","Type":"ContainerStarted","Data":"c588159e3044be6431c5dd5727670feaf242cb9b0cdae87117a202ffd5a54c52"} Apr 22 19:59:40.301339 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:40.301315 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-br2nv" event={"ID":"93becda5-df4b-41f0-954c-ed611504c70c","Type":"ContainerStarted","Data":"95114f94f231b755c68042ba5f274553dcb67db5b0b115b84db36e9f7c195f8d"} Apr 22 19:59:40.302464 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:40.302422 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" event={"ID":"408c985c-a3f6-453e-9cd5-3acc576e7673","Type":"ContainerStarted","Data":"12403fcdded0754ee8460f39b6616749938651cdc49bd4bc4575b6e6ab617dbe"} Apr 22 19:59:40.999973 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:40.999938 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:41.000167 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:41.000032 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:41.000167 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:41.000073 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zcjkg\" (UID: \"24b5efed-ced4-4697-80ee-7021d3b4c69b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" Apr 22 19:59:41.000167 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:41.000116 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:59:41.000319 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:41.000167 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:59:41.000319 ip-10-0-135-221 
Apr 22 19:59:41.000319 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:41.000205 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle podName:a7cb3649-fe15-4eeb-a0e1-e3b600a54358 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:43.000188205 +0000 UTC m=+71.420094998 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle") pod "router-default-d7dc7d6db-nq4vr" (UID: "a7cb3649-fe15-4eeb-a0e1-e3b600a54358") : configmap references non-existent config key: service-ca.crt
Apr 22 19:59:41.000319 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:41.000226 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls podName:24b5efed-ced4-4697-80ee-7021d3b4c69b nodeName:}" failed. No retries permitted until 2026-04-22 19:59:43.000218207 +0000 UTC m=+71.420124975 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zcjkg" (UID: "24b5efed-ced4-4697-80ee-7021d3b4c69b") : secret "samples-operator-tls" not found
Apr 22 19:59:41.307217 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:41.307133 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rgxkv" event={"ID":"428fb0f0-657f-42fe-874e-120700caf3c2","Type":"ContainerStarted","Data":"d687099e3f3c2d356a2dce00dd7b74679d36efff433898daf99c3d7e5b11491b"}
Apr 22 19:59:41.307976 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:41.307932 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rgxkv"
Apr 22 19:59:41.323021 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:41.322462 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rgxkv" podStartSLOduration=66.460822334 podStartE2EDuration="1m9.322444906s" podCreationTimestamp="2026-04-22 19:58:32 +0000 UTC" firstStartedPulling="2026-04-22 19:59:38.204714298 +0000 UTC m=+66.624621071" lastFinishedPulling="2026-04-22 19:59:41.066336872 +0000 UTC m=+69.486243643" observedRunningTime="2026-04-22 19:59:41.32168913 +0000 UTC m=+69.741595924" watchObservedRunningTime="2026-04-22 19:59:41.322444906 +0000 UTC m=+69.742351699"
Apr 22 19:59:43.018499 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:43.018455 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr"
Apr 22 19:59:43.018973 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:43.018538 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zcjkg\" (UID: \"24b5efed-ced4-4697-80ee-7021d3b4c69b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg"
Apr 22 19:59:43.018973 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:43.018571 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr"
Apr 22 19:59:43.018973 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:43.018624 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle podName:a7cb3649-fe15-4eeb-a0e1-e3b600a54358 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:47.01860043 +0000 UTC m=+75.438507217 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle") pod "router-default-d7dc7d6db-nq4vr" (UID: "a7cb3649-fe15-4eeb-a0e1-e3b600a54358") : configmap references non-existent config key: service-ca.crt
Apr 22 19:59:43.018973 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:43.018696 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 19:59:43.018973 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:43.018707 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 19:59:43.018973 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:43.018748 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs podName:a7cb3649-fe15-4eeb-a0e1-e3b600a54358 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:47.018730914 +0000 UTC m=+75.438637701 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs") pod "router-default-d7dc7d6db-nq4vr" (UID: "a7cb3649-fe15-4eeb-a0e1-e3b600a54358") : secret "router-metrics-certs-default" not found
Apr 22 19:59:43.018973 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:43.018770 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls podName:24b5efed-ced4-4697-80ee-7021d3b4c69b nodeName:}" failed. No retries permitted until 2026-04-22 19:59:47.01875511 +0000 UTC m=+75.438661892 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zcjkg" (UID: "24b5efed-ced4-4697-80ee-7021d3b4c69b") : secret "samples-operator-tls" not found
Apr 22 19:59:44.315449 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:44.315394 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" event={"ID":"408c985c-a3f6-453e-9cd5-3acc576e7673","Type":"ContainerStarted","Data":"8ac0eed911d116ce098d6b3ba37b87da78851721217285a93026d7b359f89bd8"}
Apr 22 19:59:44.316960 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:44.316930 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" event={"ID":"32c9ae24-cac4-4927-a2d1-a0542ac9e54d","Type":"ContainerStarted","Data":"a0163fd8b76649ef59a757122a48d1db4412212bcab1df8b346e6fc4f4858e1b"}
Apr 22 19:59:44.318497 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:44.318435 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-htt2l" event={"ID":"47e5d131-56cf-49a5-bc27-d784bcab468a","Type":"ContainerStarted","Data":"60e355ef1a0eb97e87f44bfdb0718afa0fec6fe9c93421d068fea13205dbefae"}
Apr 22 19:59:44.319953 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:44.319930 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-br2nv" event={"ID":"93becda5-df4b-41f0-954c-ed611504c70c","Type":"ContainerStarted","Data":"9ecef90c4b5ea802e9acd0ef087be3a9b28e1adf4df4a486567817b6719c3c5b"}
Apr 22 19:59:44.332597 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:44.332552 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" podStartSLOduration=1.1722418079999999 podStartE2EDuration="5.332540348s" podCreationTimestamp="2026-04-22 19:59:39 +0000 UTC" firstStartedPulling="2026-04-22 19:59:39.666507812 +0000 UTC m=+68.086414590" lastFinishedPulling="2026-04-22 19:59:43.826806362 +0000 UTC m=+72.246713130" observedRunningTime="2026-04-22 19:59:44.331851602 +0000 UTC m=+72.751758394" watchObservedRunningTime="2026-04-22 19:59:44.332540348 +0000 UTC m=+72.752447139"
Apr 22 19:59:44.345785 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:44.345666 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" podStartSLOduration=1.25819744 podStartE2EDuration="5.345649359s" podCreationTimestamp="2026-04-22 19:59:39 +0000 UTC" firstStartedPulling="2026-04-22 19:59:39.736683629 +0000 UTC m=+68.156590398" lastFinishedPulling="2026-04-22 19:59:43.824135533 +0000 UTC m=+72.244042317" observedRunningTime="2026-04-22 19:59:44.344652881 +0000 UTC m=+72.764559673" watchObservedRunningTime="2026-04-22 19:59:44.345649359 +0000 UTC m=+72.765556151"
Apr 22 19:59:44.360329 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:44.360142 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-br2nv" podStartSLOduration=1.416823432 podStartE2EDuration="5.360123738s" podCreationTimestamp="2026-04-22 19:59:39 +0000 UTC" firstStartedPulling="2026-04-22 19:59:39.886048562 +0000 UTC
m=+68.305955332" lastFinishedPulling="2026-04-22 19:59:43.829348856 +0000 UTC m=+72.249255638" observedRunningTime="2026-04-22 19:59:44.358956385 +0000 UTC m=+72.778863177" watchObservedRunningTime="2026-04-22 19:59:44.360123738 +0000 UTC m=+72.780030525" Apr 22 19:59:44.380407 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:44.380355 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-htt2l" podStartSLOduration=1.439958058 podStartE2EDuration="5.380337015s" podCreationTimestamp="2026-04-22 19:59:39 +0000 UTC" firstStartedPulling="2026-04-22 19:59:39.883121543 +0000 UTC m=+68.303028321" lastFinishedPulling="2026-04-22 19:59:43.823500507 +0000 UTC m=+72.243407278" observedRunningTime="2026-04-22 19:59:44.379270673 +0000 UTC m=+72.799177474" watchObservedRunningTime="2026-04-22 19:59:44.380337015 +0000 UTC m=+72.800243806" Apr 22 19:59:45.281519 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:45.281481 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-jvsd6"] Apr 22 19:59:45.285687 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:45.285670 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jvsd6" Apr 22 19:59:45.288040 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:45.288012 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 19:59:45.288294 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:45.288279 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 19:59:45.289038 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:45.289024 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-6dsk8\"" Apr 22 19:59:45.301382 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:45.301355 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-jvsd6"] Apr 22 19:59:45.341513 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:45.341484 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2zcw\" (UniqueName: \"kubernetes.io/projected/dc7d4316-b578-4140-b514-1a54cebef99f-kube-api-access-r2zcw\") pod \"migrator-74bb7799d9-jvsd6\" (UID: \"dc7d4316-b578-4140-b514-1a54cebef99f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jvsd6" Apr 22 19:59:45.442171 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:45.442131 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2zcw\" (UniqueName: \"kubernetes.io/projected/dc7d4316-b578-4140-b514-1a54cebef99f-kube-api-access-r2zcw\") pod \"migrator-74bb7799d9-jvsd6\" (UID: \"dc7d4316-b578-4140-b514-1a54cebef99f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jvsd6" Apr 22 19:59:45.449842 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:45.449790 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2zcw\" (UniqueName: \"kubernetes.io/projected/dc7d4316-b578-4140-b514-1a54cebef99f-kube-api-access-r2zcw\") pod \"migrator-74bb7799d9-jvsd6\" (UID: 
\"dc7d4316-b578-4140-b514-1a54cebef99f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jvsd6" Apr 22 19:59:45.594480 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:45.594383 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jvsd6" Apr 22 19:59:45.715364 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:45.715329 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-jvsd6"] Apr 22 19:59:45.718616 ip-10-0-135-221 kubenswrapper[2583]: W0422 19:59:45.718583 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc7d4316_b578_4140_b514_1a54cebef99f.slice/crio-6519e68eca35123557c16e271e6287c75ac15aece92a7e83691a478df985e8aa WatchSource:0}: Error finding container 6519e68eca35123557c16e271e6287c75ac15aece92a7e83691a478df985e8aa: Status 404 returned error can't find the container with id 6519e68eca35123557c16e271e6287c75ac15aece92a7e83691a478df985e8aa Apr 22 19:59:46.325268 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:46.325233 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jvsd6" event={"ID":"dc7d4316-b578-4140-b514-1a54cebef99f","Type":"ContainerStarted","Data":"6519e68eca35123557c16e271e6287c75ac15aece92a7e83691a478df985e8aa"} Apr 22 19:59:47.056328 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:47.056301 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:47.056646 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:47.056347 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zcjkg\" (UID: \"24b5efed-ced4-4697-80ee-7021d3b4c69b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" Apr 22 19:59:47.056646 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:47.056368 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:47.056646 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:47.056454 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:59:47.056646 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:47.056468 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle podName:a7cb3649-fe15-4eeb-a0e1-e3b600a54358 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:55.056451484 +0000 UTC m=+83.476358257 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle") pod "router-default-d7dc7d6db-nq4vr" (UID: "a7cb3649-fe15-4eeb-a0e1-e3b600a54358") : configmap references non-existent config key: service-ca.crt Apr 22 19:59:47.056646 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:47.056477 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:59:47.056646 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:47.056504 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls podName:24b5efed-ced4-4697-80ee-7021d3b4c69b nodeName:}" failed. No retries permitted until 2026-04-22 19:59:55.056491476 +0000 UTC m=+83.476398248 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-zcjkg" (UID: "24b5efed-ced4-4697-80ee-7021d3b4c69b") : secret "samples-operator-tls" not found Apr 22 19:59:47.056646 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:47.056518 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs podName:a7cb3649-fe15-4eeb-a0e1-e3b600a54358 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:55.056511652 +0000 UTC m=+83.476418421 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs") pod "router-default-d7dc7d6db-nq4vr" (UID: "a7cb3649-fe15-4eeb-a0e1-e3b600a54358") : secret "router-metrics-certs-default" not found Apr 22 19:59:47.330169 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:47.330136 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jvsd6" event={"ID":"dc7d4316-b578-4140-b514-1a54cebef99f","Type":"ContainerStarted","Data":"8a7de76bab0721dd5df4d74828ccd26fdf84ba61c11862f7e39e0d3dfc4cb334"} Apr 22 19:59:47.330169 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:47.330172 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jvsd6" event={"ID":"dc7d4316-b578-4140-b514-1a54cebef99f","Type":"ContainerStarted","Data":"1636116a8994d07a58aca9277a9ed122e65d817e067a096a750677ca7342b08f"} Apr 22 19:59:47.349546 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:47.349499 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jvsd6" podStartSLOduration=1.164613635 podStartE2EDuration="2.349486498s" podCreationTimestamp="2026-04-22 19:59:45 +0000 UTC" firstStartedPulling="2026-04-22 19:59:45.720701608 +0000 UTC m=+74.140608376" lastFinishedPulling="2026-04-22 19:59:46.905574468 +0000 UTC m=+75.325481239" observedRunningTime="2026-04-22 19:59:47.348425775 +0000 UTC m=+75.768332565" watchObservedRunningTime="2026-04-22 19:59:47.349486498 +0000 UTC m=+75.769393288" Apr 22 19:59:48.847147 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:48.847118 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w5slr_0cb9503d-e2e9-4f70-97aa-e8fa372598fc/dns-node-resolver/0.log" Apr 22 19:59:49.447229 
ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:49.447201 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gvfnk_01431c0e-d992-47b0-b2db-613b46bfb3ba/node-ca/0.log" Apr 22 19:59:50.848114 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:50.847973 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-jvsd6_dc7d4316-b578-4140-b514-1a54cebef99f/migrator/0.log" Apr 22 19:59:51.052581 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:51.052554 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-jvsd6_dc7d4316-b578-4140-b514-1a54cebef99f/graceful-termination/0.log" Apr 22 19:59:51.249636 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:51.249605 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-vwfsv_408c985c-a3f6-453e-9cd5-3acc576e7673/kube-storage-version-migrator-operator/0.log" Apr 22 19:59:55.126587 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:55.126547 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:55.127004 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:55.126607 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zcjkg\" (UID: \"24b5efed-ced4-4697-80ee-7021d3b4c69b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" Apr 22 19:59:55.127004 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:55.126729 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 19:59:55.127004 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:55.126737 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle podName:a7cb3649-fe15-4eeb-a0e1-e3b600a54358 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:11.126715472 +0000 UTC m=+99.546622261 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle") pod "router-default-d7dc7d6db-nq4vr" (UID: "a7cb3649-fe15-4eeb-a0e1-e3b600a54358") : configmap references non-existent config key: service-ca.crt Apr 22 19:59:55.127004 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:55.126849 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:59:55.127004 ip-10-0-135-221 kubenswrapper[2583]: E0422 19:59:55.126924 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs podName:a7cb3649-fe15-4eeb-a0e1-e3b600a54358 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:11.12691232 +0000 UTC m=+99.546819092 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs") pod "router-default-d7dc7d6db-nq4vr" (UID: "a7cb3649-fe15-4eeb-a0e1-e3b600a54358") : secret "router-metrics-certs-default" not found Apr 22 19:59:55.129140 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:55.129122 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/24b5efed-ced4-4697-80ee-7021d3b4c69b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-zcjkg\" (UID: \"24b5efed-ced4-4697-80ee-7021d3b4c69b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" Apr 22 19:59:55.214682 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:55.214652 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" Apr 22 19:59:55.328232 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:55.328199 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg"] Apr 22 19:59:56.357629 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:56.357583 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" event={"ID":"24b5efed-ced4-4697-80ee-7021d3b4c69b","Type":"ContainerStarted","Data":"fe5cdf91f5ab5689a5686fe8d9fdd516a1fce3d268b70c9f1c2a0eccaab2f254"} Apr 22 19:59:57.361780 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:57.361697 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" event={"ID":"24b5efed-ced4-4697-80ee-7021d3b4c69b","Type":"ContainerStarted","Data":"8dff158f25de1972b52f491361e58277562eac575ee565aec14df24af239f94e"} Apr 22 19:59:57.361780 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:57.361736 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" event={"ID":"24b5efed-ced4-4697-80ee-7021d3b4c69b","Type":"ContainerStarted","Data":"4051f68fea12b8b6ad4d3d5a7683465a2d49c072c31f8b90355e87787d60a7ef"} Apr 22 19:59:57.376078 ip-10-0-135-221 kubenswrapper[2583]: I0422 19:59:57.376033 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-zcjkg" podStartSLOduration=16.645566788 podStartE2EDuration="18.376020082s" podCreationTimestamp="2026-04-22 19:59:39 +0000 UTC" firstStartedPulling="2026-04-22 19:59:55.374759232 +0000 UTC m=+83.794666001" lastFinishedPulling="2026-04-22 19:59:57.105212526 +0000 UTC m=+85.525119295" observedRunningTime="2026-04-22 19:59:57.375265774 +0000 UTC m=+85.795172566" watchObservedRunningTime="2026-04-22 19:59:57.376020082 +0000 UTC m=+85.795926873" Apr 22 20:00:08.568741 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.568707 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8wd72"] Apr 22 20:00:08.572979 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.572951 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.577064 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.577045 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rfw8m\"" Apr 22 20:00:08.577677 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.577662 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 20:00:08.582092 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.582076 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 20:00:08.588071 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.588051 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8wd72"] Apr 22 20:00:08.735102 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.735066 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e0754b2-cef4-4b51-b452-93abecb53041-data-volume\") pod \"insights-runtime-extractor-8wd72\" (UID: \"9e0754b2-cef4-4b51-b452-93abecb53041\") " pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.735102 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.735109 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9e0754b2-cef4-4b51-b452-93abecb53041-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8wd72\" (UID: \"9e0754b2-cef4-4b51-b452-93abecb53041\") " pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.735324 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.735144 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd68j\" (UniqueName: \"kubernetes.io/projected/9e0754b2-cef4-4b51-b452-93abecb53041-kube-api-access-fd68j\") pod \"insights-runtime-extractor-8wd72\" (UID: \"9e0754b2-cef4-4b51-b452-93abecb53041\") " pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.735324 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.735218 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9e0754b2-cef4-4b51-b452-93abecb53041-crio-socket\") pod \"insights-runtime-extractor-8wd72\" (UID: \"9e0754b2-cef4-4b51-b452-93abecb53041\") " pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.735324 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.735245 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9e0754b2-cef4-4b51-b452-93abecb53041-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8wd72\" (UID: \"9e0754b2-cef4-4b51-b452-93abecb53041\") " pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.835602 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.835528 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd68j\" (UniqueName: \"kubernetes.io/projected/9e0754b2-cef4-4b51-b452-93abecb53041-kube-api-access-fd68j\") pod \"insights-runtime-extractor-8wd72\" (UID: 
\"9e0754b2-cef4-4b51-b452-93abecb53041\") " pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.835602 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.835568 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9e0754b2-cef4-4b51-b452-93abecb53041-crio-socket\") pod \"insights-runtime-extractor-8wd72\" (UID: \"9e0754b2-cef4-4b51-b452-93abecb53041\") " pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.835602 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.835586 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9e0754b2-cef4-4b51-b452-93abecb53041-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8wd72\" (UID: \"9e0754b2-cef4-4b51-b452-93abecb53041\") " pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.835839 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.835662 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9e0754b2-cef4-4b51-b452-93abecb53041-crio-socket\") pod \"insights-runtime-extractor-8wd72\" (UID: \"9e0754b2-cef4-4b51-b452-93abecb53041\") " pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.835839 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.835669 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e0754b2-cef4-4b51-b452-93abecb53041-data-volume\") pod \"insights-runtime-extractor-8wd72\" (UID: \"9e0754b2-cef4-4b51-b452-93abecb53041\") " pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.835839 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.835717 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9e0754b2-cef4-4b51-b452-93abecb53041-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8wd72\" (UID: \"9e0754b2-cef4-4b51-b452-93abecb53041\") " pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.836062 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.836041 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e0754b2-cef4-4b51-b452-93abecb53041-data-volume\") pod \"insights-runtime-extractor-8wd72\" (UID: \"9e0754b2-cef4-4b51-b452-93abecb53041\") " pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.836282 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.836260 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9e0754b2-cef4-4b51-b452-93abecb53041-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8wd72\" (UID: \"9e0754b2-cef4-4b51-b452-93abecb53041\") " pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.838103 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.838086 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9e0754b2-cef4-4b51-b452-93abecb53041-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8wd72\" (UID: \"9e0754b2-cef4-4b51-b452-93abecb53041\") " pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.843361 
ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.843337 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd68j\" (UniqueName: \"kubernetes.io/projected/9e0754b2-cef4-4b51-b452-93abecb53041-kube-api-access-fd68j\") pod \"insights-runtime-extractor-8wd72\" (UID: \"9e0754b2-cef4-4b51-b452-93abecb53041\") " pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:08.881856 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:08.881833 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8wd72" Apr 22 20:00:09.005319 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:09.005283 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8wd72"] Apr 22 20:00:09.008966 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:00:09.008936 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e0754b2_cef4_4b51_b452_93abecb53041.slice/crio-42f932fce5621dabef876778c4abe09ac3bb7ad43ea9264469e3c74a6f99137e WatchSource:0}: Error finding container 42f932fce5621dabef876778c4abe09ac3bb7ad43ea9264469e3c74a6f99137e: Status 404 returned error can't find the container with id 42f932fce5621dabef876778c4abe09ac3bb7ad43ea9264469e3c74a6f99137e Apr 22 20:00:09.395014 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:09.394974 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8wd72" event={"ID":"9e0754b2-cef4-4b51-b452-93abecb53041","Type":"ContainerStarted","Data":"0fbf1233bfc6cb79fa656ede852d8fdfd3240465e400f0ea73804aa49e668e41"} Apr 22 20:00:09.395014 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:09.395012 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8wd72" event={"ID":"9e0754b2-cef4-4b51-b452-93abecb53041","Type":"ContainerStarted","Data":"42f932fce5621dabef876778c4abe09ac3bb7ad43ea9264469e3c74a6f99137e"} Apr 22 20:00:10.400364 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:10.400329 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8wd72" event={"ID":"9e0754b2-cef4-4b51-b452-93abecb53041","Type":"ContainerStarted","Data":"78f9b3aea842f957b834edb5ca0fc6faad228c193d7d43ba50218d650c5c14e3"} Apr 22 20:00:10.450370 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:10.450336 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " pod="openshift-dns/dns-default-2llws" Apr 22 20:00:10.450370 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:10.450376 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert\") pod \"ingress-canary-kwgk6\" (UID: \"c938b1dd-fed3-4797-aab7-2136204f1cd8\") " pod="openshift-ingress-canary/ingress-canary-kwgk6" Apr 22 20:00:10.453143 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:10.453111 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b46c2ec-f7aa-4451-90a4-5e3695b9ed78-metrics-tls\") pod \"dns-default-2llws\" (UID: \"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78\") " 
pod="openshift-dns/dns-default-2llws" Apr 22 20:00:10.453314 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:10.453294 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c938b1dd-fed3-4797-aab7-2136204f1cd8-cert\") pod \"ingress-canary-kwgk6\" (UID: \"c938b1dd-fed3-4797-aab7-2136204f1cd8\") " pod="openshift-ingress-canary/ingress-canary-kwgk6" Apr 22 20:00:10.707828 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:10.707794 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2xtr6\"" Apr 22 20:00:10.716765 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:10.716735 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2llws" Apr 22 20:00:10.723520 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:10.723449 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-g9zd7\"" Apr 22 20:00:10.732039 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:10.732008 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kwgk6" Apr 22 20:00:10.864674 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:10.864637 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2llws"] Apr 22 20:00:10.867260 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:00:10.867225 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b46c2ec_f7aa_4451_90a4_5e3695b9ed78.slice/crio-a53c4708eaaf7a2c68076a5d71a249369113be648e9325633ee50092ee22d607 WatchSource:0}: Error finding container a53c4708eaaf7a2c68076a5d71a249369113be648e9325633ee50092ee22d607: Status 404 returned error can't find the container with id a53c4708eaaf7a2c68076a5d71a249369113be648e9325633ee50092ee22d607 Apr 22 20:00:10.885071 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:10.885044 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kwgk6"] Apr 22 20:00:10.889999 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:00:10.889974 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc938b1dd_fed3_4797_aab7_2136204f1cd8.slice/crio-4a8a01a292a50e8be6c9aee3952f513db454872fa9f60d13ce8eef488503fd1f WatchSource:0}: Error finding container 4a8a01a292a50e8be6c9aee3952f513db454872fa9f60d13ce8eef488503fd1f: Status 404 returned error can't find the container with id 4a8a01a292a50e8be6c9aee3952f513db454872fa9f60d13ce8eef488503fd1f Apr 22 20:00:11.155520 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:11.155440 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 20:00:11.155520 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:11.155510 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " 
pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 20:00:11.156163 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:11.156132 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-service-ca-bundle\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 20:00:11.158247 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:11.158224 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7cb3649-fe15-4eeb-a0e1-e3b600a54358-metrics-certs\") pod \"router-default-d7dc7d6db-nq4vr\" (UID: \"a7cb3649-fe15-4eeb-a0e1-e3b600a54358\") " pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 20:00:11.403922 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:11.403891 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2llws" event={"ID":"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78","Type":"ContainerStarted","Data":"a53c4708eaaf7a2c68076a5d71a249369113be648e9325633ee50092ee22d607"} Apr 22 20:00:11.404841 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:11.404815 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kwgk6" event={"ID":"c938b1dd-fed3-4797-aab7-2136204f1cd8","Type":"ContainerStarted","Data":"4a8a01a292a50e8be6c9aee3952f513db454872fa9f60d13ce8eef488503fd1f"} Apr 22 20:00:11.408091 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:11.408049 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 20:00:11.546929 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:11.546900 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-d7dc7d6db-nq4vr"] Apr 22 20:00:11.549406 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:00:11.549380 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7cb3649_fe15_4eeb_a0e1_e3b600a54358.slice/crio-4bd0a8e18056bcab64cd8e1d3d2f2b380cfead0b4e2587e2bdc898ddc812a4a5 WatchSource:0}: Error finding container 4bd0a8e18056bcab64cd8e1d3d2f2b380cfead0b4e2587e2bdc898ddc812a4a5: Status 404 returned error can't find the container with id 4bd0a8e18056bcab64cd8e1d3d2f2b380cfead0b4e2587e2bdc898ddc812a4a5 Apr 22 20:00:12.409939 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:12.409902 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8wd72" event={"ID":"9e0754b2-cef4-4b51-b452-93abecb53041","Type":"ContainerStarted","Data":"a3b8d84899683a2081f67ab660dbcd13cd90f1ec9db50a0461675df9936ba7ce"} Apr 22 20:00:12.411562 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:12.411529 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" event={"ID":"a7cb3649-fe15-4eeb-a0e1-e3b600a54358","Type":"ContainerStarted","Data":"985a9e1303e2aa299a6ff4aa09149ef5877eb1f4a6d96c838d79004a1f4bcc2f"} Apr 22 20:00:12.411695 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:12.411569 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" event={"ID":"a7cb3649-fe15-4eeb-a0e1-e3b600a54358","Type":"ContainerStarted","Data":"4bd0a8e18056bcab64cd8e1d3d2f2b380cfead0b4e2587e2bdc898ddc812a4a5"} Apr 22 
20:00:12.427461 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:12.427411 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8wd72" podStartSLOduration=2.061477733 podStartE2EDuration="4.427398413s" podCreationTimestamp="2026-04-22 20:00:08 +0000 UTC" firstStartedPulling="2026-04-22 20:00:09.063806626 +0000 UTC m=+97.483713396" lastFinishedPulling="2026-04-22 20:00:11.429727308 +0000 UTC m=+99.849634076" observedRunningTime="2026-04-22 20:00:12.426641058 +0000 UTC m=+100.846547854" watchObservedRunningTime="2026-04-22 20:00:12.427398413 +0000 UTC m=+100.847305203" Apr 22 20:00:12.445191 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:12.445132 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" podStartSLOduration=33.445117091 podStartE2EDuration="33.445117091s" podCreationTimestamp="2026-04-22 19:59:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:00:12.444281783 +0000 UTC m=+100.864188585" watchObservedRunningTime="2026-04-22 20:00:12.445117091 +0000 UTC m=+100.865023881" Apr 22 20:00:13.313730 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.313662 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rgxkv" Apr 22 20:00:13.408685 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.408651 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 20:00:13.412294 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.412267 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 20:00:13.415555 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.415519 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kwgk6" event={"ID":"c938b1dd-fed3-4797-aab7-2136204f1cd8","Type":"ContainerStarted","Data":"ab5a503e0e20caffb0cac005bb5a17dcb7469fce33c0ac85c10ca8480ce77940"} Apr 22 20:00:13.417049 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.417026 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2llws" event={"ID":"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78","Type":"ContainerStarted","Data":"1ec2aa7fbff04145fe9225e19e7bfd4a7a680f320492fbef7711eb1d6b0ea816"} Apr 22 20:00:13.417148 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.417056 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2llws" event={"ID":"7b46c2ec-f7aa-4451-90a4-5e3695b9ed78","Type":"ContainerStarted","Data":"0824d67c3c5782a7c56f7444b440803564aa50b74f2dd2acd035b02c2acfd33f"} Apr 22 20:00:13.417266 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.417253 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 20:00:13.418303 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.418284 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-d7dc7d6db-nq4vr" Apr 22 20:00:13.446020 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.445931 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kwgk6" podStartSLOduration=65.33745453 
podStartE2EDuration="1m7.445917801s" podCreationTimestamp="2026-04-22 19:59:06 +0000 UTC" firstStartedPulling="2026-04-22 20:00:10.892144875 +0000 UTC m=+99.312051654" lastFinishedPulling="2026-04-22 20:00:13.000608117 +0000 UTC m=+101.420514925" observedRunningTime="2026-04-22 20:00:13.445464757 +0000 UTC m=+101.865371569" watchObservedRunningTime="2026-04-22 20:00:13.445917801 +0000 UTC m=+101.865824594" Apr 22 20:00:13.490196 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.490148 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2llws" podStartSLOduration=65.362396399 podStartE2EDuration="1m7.490132507s" podCreationTimestamp="2026-04-22 19:59:06 +0000 UTC" firstStartedPulling="2026-04-22 20:00:10.869664735 +0000 UTC m=+99.289571510" lastFinishedPulling="2026-04-22 20:00:12.997400831 +0000 UTC m=+101.417307618" observedRunningTime="2026-04-22 20:00:13.489650461 +0000 UTC m=+101.909557264" watchObservedRunningTime="2026-04-22 20:00:13.490132507 +0000 UTC m=+101.910039298" Apr 22 20:00:13.678696 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.678667 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ls8ts"] Apr 22 20:00:13.681771 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.681755 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ls8ts" Apr 22 20:00:13.683878 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.683836 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 20:00:13.684011 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.683883 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-blxf4\"" Apr 22 20:00:13.689636 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.689615 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ls8ts"] Apr 22 20:00:13.778273 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.778245 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8f9f60bf-8244-42b4-942e-45f9cc8a9567-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ls8ts\" (UID: \"8f9f60bf-8244-42b4-942e-45f9cc8a9567\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ls8ts" Apr 22 20:00:13.878955 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.878850 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8f9f60bf-8244-42b4-942e-45f9cc8a9567-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ls8ts\" (UID: \"8f9f60bf-8244-42b4-942e-45f9cc8a9567\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ls8ts" Apr 22 20:00:13.881435 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.881417 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8f9f60bf-8244-42b4-942e-45f9cc8a9567-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ls8ts\" (UID: \"8f9f60bf-8244-42b4-942e-45f9cc8a9567\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ls8ts" Apr 22 20:00:13.990641 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:13.990586 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ls8ts" Apr 22 20:00:14.109929 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:14.109898 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ls8ts"] Apr 22 20:00:14.112765 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:00:14.112734 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f9f60bf_8244_42b4_942e_45f9cc8a9567.slice/crio-1c08a6937270b0fc30e6e0fd5be737e8732bda788be0abf1e8e2b51df6b790e9 WatchSource:0}: Error finding container 1c08a6937270b0fc30e6e0fd5be737e8732bda788be0abf1e8e2b51df6b790e9: Status 404 returned error can't find the container with id 1c08a6937270b0fc30e6e0fd5be737e8732bda788be0abf1e8e2b51df6b790e9 Apr 22 20:00:14.421177 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:14.421137 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ls8ts" event={"ID":"8f9f60bf-8244-42b4-942e-45f9cc8a9567","Type":"ContainerStarted","Data":"1c08a6937270b0fc30e6e0fd5be737e8732bda788be0abf1e8e2b51df6b790e9"} Apr 22 20:00:14.421599 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:14.421309 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2llws" Apr 22 20:00:15.424750 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:15.424716 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ls8ts" event={"ID":"8f9f60bf-8244-42b4-942e-45f9cc8a9567","Type":"ContainerStarted","Data":"00fc18954ec865eece10a9729adef91b51e5ca1e4abe76ed49392623f610b572"} Apr 22 20:00:15.440266 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:15.440213 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ls8ts" podStartSLOduration=1.283635383 podStartE2EDuration="2.440199069s" podCreationTimestamp="2026-04-22 20:00:13 +0000 UTC" firstStartedPulling="2026-04-22 20:00:14.114715442 +0000 UTC m=+102.534622216" lastFinishedPulling="2026-04-22 20:00:15.27127913 +0000 UTC m=+103.691185902" observedRunningTime="2026-04-22 20:00:15.439080273 +0000 UTC m=+103.858987065" watchObservedRunningTime="2026-04-22 20:00:15.440199069 +0000 UTC m=+103.860105860" Apr 22 20:00:16.428688 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:16.428653 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ls8ts" Apr 22 20:00:16.433325 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:16.433300 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ls8ts" Apr 22 20:00:21.072618 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.072582 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc"] Apr 22 20:00:21.075999 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.075982 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" Apr 22 20:00:21.078225 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.078197 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 20:00:21.078335 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.078231 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 20:00:21.078822 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.078799 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 20:00:21.078960 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.078809 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 20:00:21.078960 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.078857 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-7nzdg\"" Apr 22 20:00:21.078960 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.078889 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 20:00:21.083487 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.083466 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc"] Apr 22 20:00:21.096588 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.096567 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dkb7w"] Apr 22 20:00:21.100078 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.100058 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.102181 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.102160 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 20:00:21.102425 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.102409 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 20:00:21.102498 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.102412 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kl442\"" Apr 22 20:00:21.102580 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.102564 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 20:00:21.135784 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.135758 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c71d6fd-4e94-4600-b17a-6b70abc22552-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lgpsc\" (UID: \"2c71d6fd-4e94-4600-b17a-6b70abc22552\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" Apr 22 20:00:21.135784 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.135792 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-sys\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.135987 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.135811 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2c71d6fd-4e94-4600-b17a-6b70abc22552-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lgpsc\" (UID: \"2c71d6fd-4e94-4600-b17a-6b70abc22552\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" Apr 22 20:00:21.135987 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.135883 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-tls\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.135987 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.135919 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-metrics-client-ca\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.135987 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.135942 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.135987 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.135959 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c71d6fd-4e94-4600-b17a-6b70abc22552-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lgpsc\" (UID: \"2c71d6fd-4e94-4600-b17a-6b70abc22552\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" Apr 22 20:00:21.135987 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.135977 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8454\" (UniqueName: \"kubernetes.io/projected/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-kube-api-access-c8454\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.136170 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.135995 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-root\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.136170 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.136033 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-wtmp\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.136170 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.136055 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4pck\" (UniqueName: \"kubernetes.io/projected/2c71d6fd-4e94-4600-b17a-6b70abc22552-kube-api-access-g4pck\") pod \"openshift-state-metrics-9d44df66c-lgpsc\" (UID: \"2c71d6fd-4e94-4600-b17a-6b70abc22552\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" Apr 22 20:00:21.136170 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.136118 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-textfile\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.136170 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.136142 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-accelerators-collector-config\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.237182 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237148 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-textfile\") pod 
\"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.237341 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237186 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-accelerators-collector-config\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.237341 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237217 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c71d6fd-4e94-4600-b17a-6b70abc22552-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lgpsc\" (UID: \"2c71d6fd-4e94-4600-b17a-6b70abc22552\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" Apr 22 20:00:21.237341 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237242 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-sys\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.237341 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237261 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2c71d6fd-4e94-4600-b17a-6b70abc22552-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lgpsc\" (UID: \"2c71d6fd-4e94-4600-b17a-6b70abc22552\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" Apr 22 20:00:21.237341 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237288 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-tls\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.237341 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237304 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-metrics-client-ca\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.237341 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237308 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-sys\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.237341 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237322 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.237794 
ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237351 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c71d6fd-4e94-4600-b17a-6b70abc22552-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lgpsc\" (UID: \"2c71d6fd-4e94-4600-b17a-6b70abc22552\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" Apr 22 20:00:21.237794 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237378 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8454\" (UniqueName: \"kubernetes.io/projected/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-kube-api-access-c8454\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.237794 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237402 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-root\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.237794 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:00:21.237428 2583 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 20:00:21.237794 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237447 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-wtmp\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.237794 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237476 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4pck\" (UniqueName: \"kubernetes.io/projected/2c71d6fd-4e94-4600-b17a-6b70abc22552-kube-api-access-g4pck\") pod \"openshift-state-metrics-9d44df66c-lgpsc\" (UID: \"2c71d6fd-4e94-4600-b17a-6b70abc22552\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" Apr 22 20:00:21.237794 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:00:21.237498 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-tls podName:0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc nodeName:}" failed. No retries permitted until 2026-04-22 20:00:21.737479599 +0000 UTC m=+110.157386375 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-tls") pod "node-exporter-dkb7w" (UID: "0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc") : secret "node-exporter-tls" not found Apr 22 20:00:21.237794 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237594 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-textfile\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.237794 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237687 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-root\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.238202 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:00:21.237843 2583 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 22 20:00:21.238202 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237855 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-wtmp\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.238202 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:00:21.237921 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c71d6fd-4e94-4600-b17a-6b70abc22552-openshift-state-metrics-tls podName:2c71d6fd-4e94-4600-b17a-6b70abc22552 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:21.737903654 +0000 UTC m=+110.157810436 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/2c71d6fd-4e94-4600-b17a-6b70abc22552-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-lgpsc" (UID: "2c71d6fd-4e94-4600-b17a-6b70abc22552") : secret "openshift-state-metrics-tls" not found Apr 22 20:00:21.238202 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237946 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-accelerators-collector-config\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.238202 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.237955 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-metrics-client-ca\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.238202 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.238002 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2c71d6fd-4e94-4600-b17a-6b70abc22552-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lgpsc\" (UID: \"2c71d6fd-4e94-4600-b17a-6b70abc22552\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" Apr 22 20:00:21.239980 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.239958 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.240067 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.240011 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2c71d6fd-4e94-4600-b17a-6b70abc22552-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lgpsc\" (UID: \"2c71d6fd-4e94-4600-b17a-6b70abc22552\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" Apr 22 20:00:21.248318 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.248292 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4pck\" (UniqueName: \"kubernetes.io/projected/2c71d6fd-4e94-4600-b17a-6b70abc22552-kube-api-access-g4pck\") pod \"openshift-state-metrics-9d44df66c-lgpsc\" (UID: \"2c71d6fd-4e94-4600-b17a-6b70abc22552\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" Apr 22 20:00:21.248714 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.248696 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8454\" (UniqueName: \"kubernetes.io/projected/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-kube-api-access-c8454\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.741356 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.741315 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-tls\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:21.741553 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.741373 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c71d6fd-4e94-4600-b17a-6b70abc22552-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lgpsc\" (UID: \"2c71d6fd-4e94-4600-b17a-6b70abc22552\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" Apr 22 20:00:21.741553 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:00:21.741464 2583 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 20:00:21.741553 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:00:21.741527 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-tls podName:0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc nodeName:}" failed. No retries permitted until 2026-04-22 20:00:22.741512555 +0000 UTC m=+111.161419328 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-tls") pod "node-exporter-dkb7w" (UID: "0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc") : secret "node-exporter-tls" not found Apr 22 20:00:21.743778 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.743755 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2c71d6fd-4e94-4600-b17a-6b70abc22552-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lgpsc\" (UID: \"2c71d6fd-4e94-4600-b17a-6b70abc22552\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" Apr 22 20:00:21.985427 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:21.985378 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" Apr 22 20:00:22.113845 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.113810 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc"] Apr 22 20:00:22.117540 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:00:22.117512 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c71d6fd_4e94_4600_b17a_6b70abc22552.slice/crio-85a4d34d866b10b68bbd1278d3cf872a80402b6901da806495d07c31e89f5a6a WatchSource:0}: Error finding container 85a4d34d866b10b68bbd1278d3cf872a80402b6901da806495d07c31e89f5a6a: Status 404 returned error can't find the container with id 85a4d34d866b10b68bbd1278d3cf872a80402b6901da806495d07c31e89f5a6a Apr 22 20:00:22.196461 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.196433 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:00:22.200847 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.200826 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.203037 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.202879 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-szjbv\"" Apr 22 20:00:22.203037 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.202902 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 20:00:22.203037 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.202940 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 20:00:22.203037 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.202958 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 20:00:22.203298 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.203218 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 20:00:22.203473 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.203445 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 20:00:22.203596 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.203576 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 20:00:22.203665 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.203647 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 20:00:22.203879 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.203842 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 20:00:22.203978 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.203915 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 20:00:22.215590 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.215494 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:00:22.244920 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.244835 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-config-volume\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.244920 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.244911 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.245255 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.244978 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.245255 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.245006 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-web-config\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.245255 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.245081 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.245255 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.245148 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.245255 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.245225 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32de9b24-31b3-4dac-9846-058bdd11cecb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.245255 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.245245 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32de9b24-31b3-4dac-9846-058bdd11cecb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.245476 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.245284 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/32de9b24-31b3-4dac-9846-058bdd11cecb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.245476 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.245308 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.245476 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.245329 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-578wg\" (UniqueName: 
\"kubernetes.io/projected/32de9b24-31b3-4dac-9846-058bdd11cecb-kube-api-access-578wg\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.245476 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.245349 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/32de9b24-31b3-4dac-9846-058bdd11cecb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.245476 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.245367 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/32de9b24-31b3-4dac-9846-058bdd11cecb-config-out\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.346497 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.346468 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/32de9b24-31b3-4dac-9846-058bdd11cecb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.346497 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.346502 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.346702 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.346520 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-578wg\" (UniqueName: \"kubernetes.io/projected/32de9b24-31b3-4dac-9846-058bdd11cecb-kube-api-access-578wg\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.346702 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.346551 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/32de9b24-31b3-4dac-9846-058bdd11cecb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.346831 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.346804 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/32de9b24-31b3-4dac-9846-058bdd11cecb-config-out\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.346940 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.346924 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-config-volume\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.346992 ip-10-0-135-221 kubenswrapper[2583]: 
I0422 20:00:22.346959 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/32de9b24-31b3-4dac-9846-058bdd11cecb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.346992 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.346969 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.347099 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.347039 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.347099 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.347068 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-web-config\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.347192 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.347096 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.347192 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.347128 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.347291 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.347187 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32de9b24-31b3-4dac-9846-058bdd11cecb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.347291 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.347218 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32de9b24-31b3-4dac-9846-058bdd11cecb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.348357 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:00:22.348171 2583 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 22 20:00:22.348357 ip-10-0-135-221 
kubenswrapper[2583]: E0422 20:00:22.348254 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-main-tls podName:32de9b24-31b3-4dac-9846-058bdd11cecb nodeName:}" failed. No retries permitted until 2026-04-22 20:00:22.848234166 +0000 UTC m=+111.268140953 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "32de9b24-31b3-4dac-9846-058bdd11cecb") : secret "alertmanager-main-tls" not found Apr 22 20:00:22.348655 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.348475 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32de9b24-31b3-4dac-9846-058bdd11cecb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.349160 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.349136 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32de9b24-31b3-4dac-9846-058bdd11cecb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.349808 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.349767 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/32de9b24-31b3-4dac-9846-058bdd11cecb-config-out\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.350075 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.350055 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.350378 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.350358 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-config-volume\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.350471 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.350453 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.350533 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.350487 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/32de9b24-31b3-4dac-9846-058bdd11cecb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.351083 ip-10-0-135-221 kubenswrapper[2583]: I0422 
20:00:22.351050 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.351160 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.351059 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.351160 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.351085 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-web-config\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.360482 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.360462 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-578wg\" (UniqueName: \"kubernetes.io/projected/32de9b24-31b3-4dac-9846-058bdd11cecb-kube-api-access-578wg\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.446305 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.446274 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" event={"ID":"2c71d6fd-4e94-4600-b17a-6b70abc22552","Type":"ContainerStarted","Data":"1b05ae84a90c1e901641cf11f53d64ccb876251952431224e744af57b3351113"} Apr 22 20:00:22.446452 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.446314 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" event={"ID":"2c71d6fd-4e94-4600-b17a-6b70abc22552","Type":"ContainerStarted","Data":"a51ba6953ce3534d454e6ae7f8677adc15ce4ff6368322b56602720128c7fcc4"} Apr 22 20:00:22.446452 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.446325 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" event={"ID":"2c71d6fd-4e94-4600-b17a-6b70abc22552","Type":"ContainerStarted","Data":"85a4d34d866b10b68bbd1278d3cf872a80402b6901da806495d07c31e89f5a6a"} Apr 22 20:00:22.751175 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.751134 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-tls\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:22.753521 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.753487 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc-node-exporter-tls\") pod \"node-exporter-dkb7w\" (UID: \"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc\") " pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:22.851881 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.851829 
2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.854708 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.854681 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:22.908607 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:22.908575 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dkb7w" Apr 22 20:00:22.916487 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:00:22.916462 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a45db1e_d88a_441f_9a6c_ce51d7d2cfbc.slice/crio-18092b3ba3e508c3e6c52501dac5a4a8d984abc066074f4666fdeca8d9b43a27 WatchSource:0}: Error finding container 18092b3ba3e508c3e6c52501dac5a4a8d984abc066074f4666fdeca8d9b43a27: Status 404 returned error can't find the container with id 18092b3ba3e508c3e6c52501dac5a4a8d984abc066074f4666fdeca8d9b43a27 Apr 22 20:00:23.123093 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:23.123012 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:23.336279 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:23.336252 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:00:23.338301 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:00:23.338257 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32de9b24_31b3_4dac_9846_058bdd11cecb.slice/crio-2275f85a50abd4def02dbbe60ef9ee502f3d9621fc56ec18708adc721dc7faaf WatchSource:0}: Error finding container 2275f85a50abd4def02dbbe60ef9ee502f3d9621fc56ec18708adc721dc7faaf: Status 404 returned error can't find the container with id 2275f85a50abd4def02dbbe60ef9ee502f3d9621fc56ec18708adc721dc7faaf Apr 22 20:00:23.450888 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:23.450820 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dkb7w" event={"ID":"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc","Type":"ContainerStarted","Data":"18092b3ba3e508c3e6c52501dac5a4a8d984abc066074f4666fdeca8d9b43a27"} Apr 22 20:00:23.452116 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:23.452083 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32de9b24-31b3-4dac-9846-058bdd11cecb","Type":"ContainerStarted","Data":"2275f85a50abd4def02dbbe60ef9ee502f3d9621fc56ec18708adc721dc7faaf"} Apr 22 20:00:23.453925 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:23.453894 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" event={"ID":"2c71d6fd-4e94-4600-b17a-6b70abc22552","Type":"ContainerStarted","Data":"8d6f863e60374d7972f9abbb34390f0eee8572bafa101ed31a232af3d09c6f54"} Apr 22 20:00:23.470555 ip-10-0-135-221 
kubenswrapper[2583]: I0422 20:00:23.470500 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lgpsc" podStartSLOduration=1.460250533 podStartE2EDuration="2.470485027s" podCreationTimestamp="2026-04-22 20:00:21 +0000 UTC" firstStartedPulling="2026-04-22 20:00:22.243253127 +0000 UTC m=+110.663159911" lastFinishedPulling="2026-04-22 20:00:23.253487622 +0000 UTC m=+111.673394405" observedRunningTime="2026-04-22 20:00:23.469695344 +0000 UTC m=+111.889602146" watchObservedRunningTime="2026-04-22 20:00:23.470485027 +0000 UTC m=+111.890391818" Apr 22 20:00:24.427733 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:24.427706 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2llws" Apr 22 20:00:24.459520 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:24.459482 2583 generic.go:358] "Generic (PLEG): container finished" podID="0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc" containerID="b09ac1e1b5654f0516aa96b0b945d3c9aec281f162cb1ba1bc334419f47cb6b3" exitCode=0 Apr 22 20:00:24.460923 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:24.460893 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dkb7w" event={"ID":"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc","Type":"ContainerDied","Data":"b09ac1e1b5654f0516aa96b0b945d3c9aec281f162cb1ba1bc334419f47cb6b3"} Apr 22 20:00:25.463881 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.463832 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dkb7w" event={"ID":"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc","Type":"ContainerStarted","Data":"04a53bfe98784907b32ce1cdc0cb2eb61e92772e58f3dda89d65b9c994618666"} Apr 22 20:00:25.464324 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.463905 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dkb7w" event={"ID":"0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc","Type":"ContainerStarted","Data":"c33c00bb0c1fefe5e46a5cc363dca6601717fe87361bbb4fac836b953cc38111"} Apr 22 20:00:25.465399 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.465375 2583 generic.go:358] "Generic (PLEG): container finished" podID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerID="954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f" exitCode=0 Apr 22 20:00:25.465511 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.465446 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32de9b24-31b3-4dac-9846-058bdd11cecb","Type":"ContainerDied","Data":"954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f"} Apr 22 20:00:25.485023 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.484979 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dkb7w" podStartSLOduration=3.763616639 podStartE2EDuration="4.484966635s" podCreationTimestamp="2026-04-22 20:00:21 +0000 UTC" firstStartedPulling="2026-04-22 20:00:22.917890482 +0000 UTC m=+111.337797251" lastFinishedPulling="2026-04-22 20:00:23.639240475 +0000 UTC m=+112.059147247" observedRunningTime="2026-04-22 20:00:25.483550894 +0000 UTC m=+113.903457686" watchObservedRunningTime="2026-04-22 20:00:25.484966635 +0000 UTC m=+113.904873425" Apr 22 20:00:25.531082 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.531052 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-74c8dd4d84-45lt5"] Apr 22 
20:00:25.534231 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.534216 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5" Apr 22 20:00:25.537020 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.537000 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 20:00:25.537125 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.537100 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 20:00:25.537192 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.537173 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-8fe3urchtm19i\"" Apr 22 20:00:25.537243 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.537203 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 20:00:25.537243 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.537215 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 20:00:25.537243 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.537219 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-mvlwh\"" Apr 22 20:00:25.542597 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.542575 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-74c8dd4d84-45lt5"] Apr 22 20:00:25.575263 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.575239 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a135e0bd-d074-49d7-88ff-c4450222dcd8-audit-log\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5" Apr 22 20:00:25.575384 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.575271 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a135e0bd-d074-49d7-88ff-c4450222dcd8-metrics-server-audit-profiles\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5" Apr 22 20:00:25.575838 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.575813 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a135e0bd-d074-49d7-88ff-c4450222dcd8-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5" Apr 22 20:00:25.576138 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.575841 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a135e0bd-d074-49d7-88ff-c4450222dcd8-client-ca-bundle\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5" 
Apr 22 20:00:25.576138 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.575921 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a135e0bd-d074-49d7-88ff-c4450222dcd8-secret-metrics-server-client-certs\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.576138 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.576043 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a135e0bd-d074-49d7-88ff-c4450222dcd8-secret-metrics-server-tls\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.576138 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.576092 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbvzv\" (UniqueName: \"kubernetes.io/projected/a135e0bd-d074-49d7-88ff-c4450222dcd8-kube-api-access-wbvzv\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.676590 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.676555 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a135e0bd-d074-49d7-88ff-c4450222dcd8-metrics-server-audit-profiles\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.676750 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.676615 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a135e0bd-d074-49d7-88ff-c4450222dcd8-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.676750 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.676733 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a135e0bd-d074-49d7-88ff-c4450222dcd8-client-ca-bundle\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.676821 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.676781 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a135e0bd-d074-49d7-88ff-c4450222dcd8-secret-metrics-server-client-certs\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.676854 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.676828 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a135e0bd-d074-49d7-88ff-c4450222dcd8-secret-metrics-server-tls\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.676926 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.676897 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbvzv\" (UniqueName: \"kubernetes.io/projected/a135e0bd-d074-49d7-88ff-c4450222dcd8-kube-api-access-wbvzv\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.676974 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.676949 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a135e0bd-d074-49d7-88ff-c4450222dcd8-audit-log\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.677290 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.677245 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a135e0bd-d074-49d7-88ff-c4450222dcd8-audit-log\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.677414 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.677377 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a135e0bd-d074-49d7-88ff-c4450222dcd8-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.677684 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.677664 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a135e0bd-d074-49d7-88ff-c4450222dcd8-metrics-server-audit-profiles\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.679467 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.679447 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a135e0bd-d074-49d7-88ff-c4450222dcd8-secret-metrics-server-tls\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.679549 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.679482 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a135e0bd-d074-49d7-88ff-c4450222dcd8-client-ca-bundle\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.679549 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.679522 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a135e0bd-d074-49d7-88ff-c4450222dcd8-secret-metrics-server-client-certs\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.686324 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.686303 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbvzv\" (UniqueName: \"kubernetes.io/projected/a135e0bd-d074-49d7-88ff-c4450222dcd8-kube-api-access-wbvzv\") pod \"metrics-server-74c8dd4d84-45lt5\" (UID: \"a135e0bd-d074-49d7-88ff-c4450222dcd8\") " pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.844138 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.844067 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5"
Apr 22 20:00:25.860311 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.860199 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-nbqln"]
Apr 22 20:00:25.864922 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.864901 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nbqln"
Apr 22 20:00:25.867287 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.867263 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 22 20:00:25.867427 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.867268 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-cw9wj\""
Apr 22 20:00:25.874617 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.874596 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-nbqln"]
Apr 22 20:00:25.878186 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.878162 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7fdabb89-cd2c-43cf-b2d4-6ff45dd2cad9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nbqln\" (UID: \"7fdabb89-cd2c-43cf-b2d4-6ff45dd2cad9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nbqln"
Apr 22 20:00:25.969177 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.969155 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-74c8dd4d84-45lt5"]
Apr 22 20:00:25.971551 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:00:25.971525 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda135e0bd_d074_49d7_88ff_c4450222dcd8.slice/crio-672e04c6da9e3a81c51ee0e3049d5cb4b8c31cfa0dac8730b8c3b0383a6ded83 WatchSource:0}: Error finding container 672e04c6da9e3a81c51ee0e3049d5cb4b8c31cfa0dac8730b8c3b0383a6ded83: Status 404 returned error can't find the container with id 672e04c6da9e3a81c51ee0e3049d5cb4b8c31cfa0dac8730b8c3b0383a6ded83
Apr 22 20:00:25.979208 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.979187 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7fdabb89-cd2c-43cf-b2d4-6ff45dd2cad9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nbqln\" (UID: \"7fdabb89-cd2c-43cf-b2d4-6ff45dd2cad9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nbqln"
Apr 22 20:00:25.981592 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:25.981573 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7fdabb89-cd2c-43cf-b2d4-6ff45dd2cad9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nbqln\" (UID: \"7fdabb89-cd2c-43cf-b2d4-6ff45dd2cad9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nbqln"
Apr 22 20:00:26.178210 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.178128 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nbqln"
Apr 22 20:00:26.322590 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.322479 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-nbqln"]
Apr 22 20:00:26.325606 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:00:26.325563 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fdabb89_cd2c_43cf_b2d4_6ff45dd2cad9.slice/crio-4fbb4cc81ca5823a0ffa0221ed22e64754d207168e1125c3d73a900b69d632c4 WatchSource:0}: Error finding container 4fbb4cc81ca5823a0ffa0221ed22e64754d207168e1125c3d73a900b69d632c4: Status 404 returned error can't find the container with id 4fbb4cc81ca5823a0ffa0221ed22e64754d207168e1125c3d73a900b69d632c4
Apr 22 20:00:26.345515 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.345486 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-69d7b59889-r8587"]
Apr 22 20:00:26.350563 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.350535 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-69d7b59889-r8587"
Apr 22 20:00:26.352760 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.352735 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 22 20:00:26.352890 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.352812 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-2kn2q\""
Apr 22 20:00:26.352966 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.352925 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 22 20:00:26.353091 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.353064 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 22 20:00:26.353209 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.353174 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 22 20:00:26.353209 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.353176 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 22 20:00:26.364921 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.364895 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 22 20:00:26.367745 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.367711 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-69d7b59889-r8587"]
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b512aec-107a-4638-b0fc-0230936d2e16-serving-certs-ca-bundle\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.383545 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.383520 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6b512aec-107a-4638-b0fc-0230936d2e16-secret-telemeter-client\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.383652 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.383565 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6b512aec-107a-4638-b0fc-0230936d2e16-telemeter-client-tls\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.383652 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.383594 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b512aec-107a-4638-b0fc-0230936d2e16-metrics-client-ca\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.383808 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.383655 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6b512aec-107a-4638-b0fc-0230936d2e16-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.383808 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.383682 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjb8t\" (UniqueName: \"kubernetes.io/projected/6b512aec-107a-4638-b0fc-0230936d2e16-kube-api-access-rjb8t\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.383808 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.383710 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6b512aec-107a-4638-b0fc-0230936d2e16-federate-client-tls\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.383808 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.383736 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b512aec-107a-4638-b0fc-0230936d2e16-telemeter-trusted-ca-bundle\") pod \"telemeter-client-69d7b59889-r8587\" (UID: 
\"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.469651 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.469612 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nbqln" event={"ID":"7fdabb89-cd2c-43cf-b2d4-6ff45dd2cad9","Type":"ContainerStarted","Data":"4fbb4cc81ca5823a0ffa0221ed22e64754d207168e1125c3d73a900b69d632c4"} Apr 22 20:00:26.470848 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.470811 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5" event={"ID":"a135e0bd-d074-49d7-88ff-c4450222dcd8","Type":"ContainerStarted","Data":"672e04c6da9e3a81c51ee0e3049d5cb4b8c31cfa0dac8730b8c3b0383a6ded83"} Apr 22 20:00:26.484527 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.484500 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b512aec-107a-4638-b0fc-0230936d2e16-serving-certs-ca-bundle\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.484662 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.484581 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6b512aec-107a-4638-b0fc-0230936d2e16-secret-telemeter-client\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.484662 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.484611 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6b512aec-107a-4638-b0fc-0230936d2e16-telemeter-client-tls\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.484662 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.484635 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b512aec-107a-4638-b0fc-0230936d2e16-metrics-client-ca\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.484826 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.484689 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6b512aec-107a-4638-b0fc-0230936d2e16-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.484826 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.484718 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjb8t\" (UniqueName: \"kubernetes.io/projected/6b512aec-107a-4638-b0fc-0230936d2e16-kube-api-access-rjb8t\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.484826 ip-10-0-135-221 
kubenswrapper[2583]: I0422 20:00:26.484746 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6b512aec-107a-4638-b0fc-0230936d2e16-federate-client-tls\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.484826 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.484771 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b512aec-107a-4638-b0fc-0230936d2e16-telemeter-trusted-ca-bundle\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.485367 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.485314 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b512aec-107a-4638-b0fc-0230936d2e16-serving-certs-ca-bundle\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.485526 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.485502 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b512aec-107a-4638-b0fc-0230936d2e16-metrics-client-ca\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.485708 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.485663 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b512aec-107a-4638-b0fc-0230936d2e16-telemeter-trusted-ca-bundle\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.488164 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.488140 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6b512aec-107a-4638-b0fc-0230936d2e16-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.488241 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.488221 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6b512aec-107a-4638-b0fc-0230936d2e16-secret-telemeter-client\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.488462 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.488445 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6b512aec-107a-4638-b0fc-0230936d2e16-telemeter-client-tls\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.488505 ip-10-0-135-221 
kubenswrapper[2583]: I0422 20:00:26.488474 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6b512aec-107a-4638-b0fc-0230936d2e16-federate-client-tls\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.493290 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.493246 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjb8t\" (UniqueName: \"kubernetes.io/projected/6b512aec-107a-4638-b0fc-0230936d2e16-kube-api-access-rjb8t\") pod \"telemeter-client-69d7b59889-r8587\" (UID: \"6b512aec-107a-4638-b0fc-0230936d2e16\") " pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.662613 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.662573 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" Apr 22 20:00:26.814997 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:26.814963 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-69d7b59889-r8587"] Apr 22 20:00:27.150310 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:00:27.150232 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b512aec_107a_4638_b0fc_0230936d2e16.slice/crio-523a0d9ad5f41bd3cf3eaca674358424b5ba316e3886cc0aed19e63890238a95 WatchSource:0}: Error finding container 523a0d9ad5f41bd3cf3eaca674358424b5ba316e3886cc0aed19e63890238a95: Status 404 returned error can't find the container with id 523a0d9ad5f41bd3cf3eaca674358424b5ba316e3886cc0aed19e63890238a95 Apr 22 20:00:27.475239 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:27.475196 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" event={"ID":"6b512aec-107a-4638-b0fc-0230936d2e16","Type":"ContainerStarted","Data":"523a0d9ad5f41bd3cf3eaca674358424b5ba316e3886cc0aed19e63890238a95"} Apr 22 20:00:28.484476 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:28.484432 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32de9b24-31b3-4dac-9846-058bdd11cecb","Type":"ContainerStarted","Data":"29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664"} Apr 22 20:00:28.484476 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:28.484472 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32de9b24-31b3-4dac-9846-058bdd11cecb","Type":"ContainerStarted","Data":"c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea"} Apr 22 20:00:28.485121 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:28.484487 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32de9b24-31b3-4dac-9846-058bdd11cecb","Type":"ContainerStarted","Data":"12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122"} Apr 22 20:00:28.485121 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:28.484500 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32de9b24-31b3-4dac-9846-058bdd11cecb","Type":"ContainerStarted","Data":"3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc"} Apr 22 20:00:28.485121 ip-10-0-135-221 kubenswrapper[2583]: 
I0422 20:00:28.484512 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32de9b24-31b3-4dac-9846-058bdd11cecb","Type":"ContainerStarted","Data":"bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f"} Apr 22 20:00:28.486129 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:28.486100 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5" event={"ID":"a135e0bd-d074-49d7-88ff-c4450222dcd8","Type":"ContainerStarted","Data":"2a804621d145ac2a5ffcf938a47d3850bedffafced9d31540b2852ff30a8dc58"} Apr 22 20:00:28.487722 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:28.487599 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nbqln" event={"ID":"7fdabb89-cd2c-43cf-b2d4-6ff45dd2cad9","Type":"ContainerStarted","Data":"bc23162130e331296d11c49f16d757ad558720996088395d8eef4dd8fea30d7d"} Apr 22 20:00:28.487911 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:28.487844 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nbqln" Apr 22 20:00:28.493183 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:28.493161 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nbqln" Apr 22 20:00:28.503980 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:28.503932 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5" podStartSLOduration=1.539944118 podStartE2EDuration="3.503920771s" podCreationTimestamp="2026-04-22 20:00:25 +0000 UTC" firstStartedPulling="2026-04-22 20:00:25.973466819 +0000 UTC m=+114.393373588" lastFinishedPulling="2026-04-22 20:00:27.937443459 +0000 UTC m=+116.357350241" observedRunningTime="2026-04-22 20:00:28.503193605 +0000 UTC m=+116.923100396" watchObservedRunningTime="2026-04-22 20:00:28.503920771 +0000 UTC m=+116.923827561" Apr 22 20:00:28.521819 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:28.521767 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nbqln" podStartSLOduration=1.910455612 podStartE2EDuration="3.521751119s" podCreationTimestamp="2026-04-22 20:00:25 +0000 UTC" firstStartedPulling="2026-04-22 20:00:26.328058419 +0000 UTC m=+114.747965188" lastFinishedPulling="2026-04-22 20:00:27.939353915 +0000 UTC m=+116.359260695" observedRunningTime="2026-04-22 20:00:28.520632138 +0000 UTC m=+116.940538928" watchObservedRunningTime="2026-04-22 20:00:28.521751119 +0000 UTC m=+116.941657911" Apr 22 20:00:30.498329 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:30.498293 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32de9b24-31b3-4dac-9846-058bdd11cecb","Type":"ContainerStarted","Data":"2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315"} Apr 22 20:00:30.500582 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:30.500548 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" event={"ID":"6b512aec-107a-4638-b0fc-0230936d2e16","Type":"ContainerStarted","Data":"ecf738ac49eb9109e6ded45da46f13a5b1f4df8b4dcecdf216f93a2a1f728530"} Apr 22 20:00:30.500582 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:30.500582 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
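The two "Observed pod startup duration" entries just above are internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration matches the E2E duration minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick check of the metrics-server numbers in Python; the timestamps are copied from the entry above (truncated to microseconds), and the subtraction rule is inferred from these values rather than taken from kubelet source:

from datetime import datetime

fmt = "%Y-%m-%d %H:%M:%S.%f"
# Values from the metrics-server pod_startup_latency_tracker entry above.
created = datetime.strptime("2026-04-22 20:00:25.000000", fmt)  # logged at 1s granularity
first_pull = datetime.strptime("2026-04-22 20:00:25.973466", fmt)
last_pull = datetime.strptime("2026-04-22 20:00:27.937443", fmt)
running = datetime.strptime("2026-04-22 20:00:28.503920", fmt)  # watchObservedRunningTime

e2e = (running - created).total_seconds()        # ~3.5039 (podStartE2EDuration)
pull = (last_pull - first_pull).total_seconds()  # ~1.9640 (image pull window)
print(f"E2E {e2e:.4f}s, pull {pull:.4f}s, SLO ~ {e2e - pull:.4f}s")

3.5039 minus 1.9640 gives roughly 1.5399, which agrees with podStartSLOduration=1.539944118 to within rounding.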
pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" event={"ID":"6b512aec-107a-4638-b0fc-0230936d2e16","Type":"ContainerStarted","Data":"c4c59a8959124eb54263520bc2156e78bf2d88bab00f5d6e6be6fa9d66f2d3c2"} Apr 22 20:00:30.500751 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:30.500593 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" event={"ID":"6b512aec-107a-4638-b0fc-0230936d2e16","Type":"ContainerStarted","Data":"1094904d37fe1828a919e90627c9ac9738cd47f5fecd2c794f1ac41b71f44cf1"} Apr 22 20:00:30.525776 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:30.525728 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.379275623 podStartE2EDuration="8.525714491s" podCreationTimestamp="2026-04-22 20:00:22 +0000 UTC" firstStartedPulling="2026-04-22 20:00:23.341586502 +0000 UTC m=+111.761493274" lastFinishedPulling="2026-04-22 20:00:29.488025373 +0000 UTC m=+117.907932142" observedRunningTime="2026-04-22 20:00:30.523679345 +0000 UTC m=+118.943586136" watchObservedRunningTime="2026-04-22 20:00:30.525714491 +0000 UTC m=+118.945621281" Apr 22 20:00:30.544516 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:30.544463 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-69d7b59889-r8587" podStartSLOduration=2.204686947 podStartE2EDuration="4.54444844s" podCreationTimestamp="2026-04-22 20:00:26 +0000 UTC" firstStartedPulling="2026-04-22 20:00:27.15270748 +0000 UTC m=+115.572614249" lastFinishedPulling="2026-04-22 20:00:29.492468958 +0000 UTC m=+117.912375742" observedRunningTime="2026-04-22 20:00:30.542813056 +0000 UTC m=+118.962719848" watchObservedRunningTime="2026-04-22 20:00:30.54444844 +0000 UTC m=+118.964355230" Apr 22 20:00:35.007505 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:35.007463 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-78cpf"] Apr 22 20:00:35.010878 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:35.010837 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-78cpf" Apr 22 20:00:35.012985 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:35.012966 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-8tgvs\"" Apr 22 20:00:35.013297 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:35.013282 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 20:00:35.013394 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:35.013380 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 20:00:35.018452 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:35.018431 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-78cpf"] Apr 22 20:00:35.061758 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:35.061720 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87jhx\" (UniqueName: \"kubernetes.io/projected/25a7c1e2-d706-4d4a-8ee5-197c8eec5993-kube-api-access-87jhx\") pod \"downloads-6bcc868b7-78cpf\" (UID: \"25a7c1e2-d706-4d4a-8ee5-197c8eec5993\") " pod="openshift-console/downloads-6bcc868b7-78cpf" Apr 22 20:00:35.162382 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:35.162341 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87jhx\" (UniqueName: \"kubernetes.io/projected/25a7c1e2-d706-4d4a-8ee5-197c8eec5993-kube-api-access-87jhx\") pod \"downloads-6bcc868b7-78cpf\" (UID: \"25a7c1e2-d706-4d4a-8ee5-197c8eec5993\") " pod="openshift-console/downloads-6bcc868b7-78cpf" Apr 22 20:00:35.171108 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:35.171075 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87jhx\" (UniqueName: \"kubernetes.io/projected/25a7c1e2-d706-4d4a-8ee5-197c8eec5993-kube-api-access-87jhx\") pod \"downloads-6bcc868b7-78cpf\" (UID: \"25a7c1e2-d706-4d4a-8ee5-197c8eec5993\") " pod="openshift-console/downloads-6bcc868b7-78cpf" Apr 22 20:00:35.320987 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:35.320888 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-78cpf" Apr 22 20:00:35.445560 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:35.445455 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-78cpf"] Apr 22 20:00:35.448397 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:00:35.448366 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25a7c1e2_d706_4d4a_8ee5_197c8eec5993.slice/crio-9a2b9cd48d4c459023a060fab4a936ab1020ad6a827d2d600d21851d331ce058 WatchSource:0}: Error finding container 9a2b9cd48d4c459023a060fab4a936ab1020ad6a827d2d600d21851d331ce058: Status 404 returned error can't find the container with id 9a2b9cd48d4c459023a060fab4a936ab1020ad6a827d2d600d21851d331ce058 Apr 22 20:00:35.515282 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:35.515246 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-78cpf" event={"ID":"25a7c1e2-d706-4d4a-8ee5-197c8eec5993","Type":"ContainerStarted","Data":"9a2b9cd48d4c459023a060fab4a936ab1020ad6a827d2d600d21851d331ce058"} Apr 22 20:00:41.825856 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:41.825802 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs\") pod \"network-metrics-daemon-q6lbk\" (UID: \"fbf58ad5-56ae-4535-a07f-980865760128\") " pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 20:00:41.828730 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:41.828702 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fbf58ad5-56ae-4535-a07f-980865760128-metrics-certs\") pod \"network-metrics-daemon-q6lbk\" (UID: \"fbf58ad5-56ae-4535-a07f-980865760128\") " pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 20:00:41.904387 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:41.904353 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vnzkf\"" Apr 22 20:00:41.912242 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:41.912217 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q6lbk" Apr 22 20:00:42.043665 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:42.043526 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q6lbk"] Apr 22 20:00:42.046885 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:00:42.046828 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbf58ad5_56ae_4535_a07f_980865760128.slice/crio-7686aa58673175f0ccd6a4418aa281dde0f94bd6cf782099b5cb81ac96ae6158 WatchSource:0}: Error finding container 7686aa58673175f0ccd6a4418aa281dde0f94bd6cf782099b5cb81ac96ae6158: Status 404 returned error can't find the container with id 7686aa58673175f0ccd6a4418aa281dde0f94bd6cf782099b5cb81ac96ae6158 Apr 22 20:00:42.539173 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:42.539130 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q6lbk" event={"ID":"fbf58ad5-56ae-4535-a07f-980865760128","Type":"ContainerStarted","Data":"7686aa58673175f0ccd6a4418aa281dde0f94bd6cf782099b5cb81ac96ae6158"} Apr 22 20:00:43.544596 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:43.544557 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q6lbk" event={"ID":"fbf58ad5-56ae-4535-a07f-980865760128","Type":"ContainerStarted","Data":"c89dc861dcf786d1e2e5e9fe1cb1aed4c0d1e01f2abde05db0ad247dc5fad177"} Apr 22 20:00:43.544596 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:43.544598 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q6lbk" event={"ID":"fbf58ad5-56ae-4535-a07f-980865760128","Type":"ContainerStarted","Data":"3c1714a127b91f87106ef8a5be3974daacabd71fcedc6cda211de2e2f704d1b6"} Apr 22 20:00:43.564513 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:43.564458 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-q6lbk" podStartSLOduration=130.547232495 podStartE2EDuration="2m11.564442152s" podCreationTimestamp="2026-04-22 19:58:32 +0000 UTC" firstStartedPulling="2026-04-22 20:00:42.049129065 +0000 UTC m=+130.469035838" lastFinishedPulling="2026-04-22 20:00:43.066338725 +0000 UTC m=+131.486245495" observedRunningTime="2026-04-22 20:00:43.561825676 +0000 UTC m=+131.981732494" watchObservedRunningTime="2026-04-22 20:00:43.564442152 +0000 UTC m=+131.984348943" Apr 22 20:00:45.844336 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:45.844302 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5" Apr 22 20:00:45.844762 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:45.844348 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5" Apr 22 20:00:47.103739 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.103705 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59ddc6565f-t2dq6"] Apr 22 20:00:47.137962 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.137929 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59ddc6565f-t2dq6"] Apr 22 20:00:47.138179 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.138072 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.140751 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.140727 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 20:00:47.141589 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.141561 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 20:00:47.141589 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.141578 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 20:00:47.141740 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.141600 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 20:00:47.141740 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.141561 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 20:00:47.141740 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.141579 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-25v8j\"" Apr 22 20:00:47.273544 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.273507 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-console-config\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.273544 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.273547 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-service-ca\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.273780 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.273590 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9eea08ec-8c00-4216-9596-b85403a43834-console-serving-cert\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.273780 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.273640 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9eea08ec-8c00-4216-9596-b85403a43834-console-oauth-config\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.273780 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.273668 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4s88\" (UniqueName: \"kubernetes.io/projected/9eea08ec-8c00-4216-9596-b85403a43834-kube-api-access-f4s88\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.273780 ip-10-0-135-221 kubenswrapper[2583]: I0422 
20:00:47.273722 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-oauth-serving-cert\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.374626 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.374539 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-console-config\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.374626 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.374585 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-service-ca\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.374626 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.374624 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9eea08ec-8c00-4216-9596-b85403a43834-console-serving-cert\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.374947 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.374657 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9eea08ec-8c00-4216-9596-b85403a43834-console-oauth-config\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.374947 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.374685 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4s88\" (UniqueName: \"kubernetes.io/projected/9eea08ec-8c00-4216-9596-b85403a43834-kube-api-access-f4s88\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.374947 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.374714 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-oauth-serving-cert\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.375487 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.375462 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-oauth-serving-cert\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.375595 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.375449 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-service-ca\") pod \"console-59ddc6565f-t2dq6\" 
(UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.375673 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.375464 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-console-config\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.377590 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.377567 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9eea08ec-8c00-4216-9596-b85403a43834-console-oauth-config\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.377724 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.377692 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9eea08ec-8c00-4216-9596-b85403a43834-console-serving-cert\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.382324 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.382300 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4s88\" (UniqueName: \"kubernetes.io/projected/9eea08ec-8c00-4216-9596-b85403a43834-kube-api-access-f4s88\") pod \"console-59ddc6565f-t2dq6\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:47.450294 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:47.450240 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:51.503985 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:51.503959 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59ddc6565f-t2dq6"] Apr 22 20:00:51.507529 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:00:51.507500 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9eea08ec_8c00_4216_9596_b85403a43834.slice/crio-6d81cec10e48ee6d41edace0fa747b3cc88c587b3894057883c0430d4f50f1f8 WatchSource:0}: Error finding container 6d81cec10e48ee6d41edace0fa747b3cc88c587b3894057883c0430d4f50f1f8: Status 404 returned error can't find the container with id 6d81cec10e48ee6d41edace0fa747b3cc88c587b3894057883c0430d4f50f1f8 Apr 22 20:00:51.575683 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:51.575649 2583 generic.go:358] "Generic (PLEG): container finished" podID="408c985c-a3f6-453e-9cd5-3acc576e7673" containerID="8ac0eed911d116ce098d6b3ba37b87da78851721217285a93026d7b359f89bd8" exitCode=0 Apr 22 20:00:51.575797 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:51.575717 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" event={"ID":"408c985c-a3f6-453e-9cd5-3acc576e7673","Type":"ContainerDied","Data":"8ac0eed911d116ce098d6b3ba37b87da78851721217285a93026d7b359f89bd8"} Apr 22 20:00:51.576146 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:51.576125 2583 scope.go:117] "RemoveContainer" containerID="8ac0eed911d116ce098d6b3ba37b87da78851721217285a93026d7b359f89bd8" Apr 22 20:00:51.577223 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:51.577198 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-78cpf" event={"ID":"25a7c1e2-d706-4d4a-8ee5-197c8eec5993","Type":"ContainerStarted","Data":"a29960b5c7919a265030698fb9977049d48d5ee7eaa6d850882d2b8f969dfc36"} Apr 22 20:00:51.577441 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:51.577410 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-78cpf" Apr 22 20:00:51.578504 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:51.578483 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59ddc6565f-t2dq6" event={"ID":"9eea08ec-8c00-4216-9596-b85403a43834","Type":"ContainerStarted","Data":"6d81cec10e48ee6d41edace0fa747b3cc88c587b3894057883c0430d4f50f1f8"} Apr 22 20:00:51.579167 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:51.579134 2583 patch_prober.go:28] interesting pod/downloads-6bcc868b7-78cpf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.21:8080/\": dial tcp 10.134.0.21:8080: connect: connection refused" start-of-body= Apr 22 20:00:51.579287 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:51.579190 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-78cpf" podUID="25a7c1e2-d706-4d4a-8ee5-197c8eec5993" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.21:8080/\": dial tcp 10.134.0.21:8080: connect: connection refused" Apr 22 20:00:51.604934 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:51.604888 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-78cpf" podStartSLOduration=1.572173335 
podStartE2EDuration="17.604874356s" podCreationTimestamp="2026-04-22 20:00:34 +0000 UTC" firstStartedPulling="2026-04-22 20:00:35.450645331 +0000 UTC m=+123.870552105" lastFinishedPulling="2026-04-22 20:00:51.483346352 +0000 UTC m=+139.903253126" observedRunningTime="2026-04-22 20:00:51.603634505 +0000 UTC m=+140.023541297" watchObservedRunningTime="2026-04-22 20:00:51.604874356 +0000 UTC m=+140.024781138" Apr 22 20:00:52.585525 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:52.585479 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vwfsv" event={"ID":"408c985c-a3f6-453e-9cd5-3acc576e7673","Type":"ContainerStarted","Data":"65cdc258fa8c4861afcf71e451e1a78956e148e0ef0570a18f3557696f52b05b"} Apr 22 20:00:52.604284 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:52.604230 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-78cpf" Apr 22 20:00:55.309293 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.309260 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f5b686dbd-7w942"] Apr 22 20:00:55.335917 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.335884 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f5b686dbd-7w942"] Apr 22 20:00:55.336082 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.336050 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.342791 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.342762 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 20:00:55.453579 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.453539 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-oauth-serving-cert\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.453691 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.453631 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55zw2\" (UniqueName: \"kubernetes.io/projected/9147f1c1-a7c3-4865-89aa-aa7083bb2032-kube-api-access-55zw2\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.453691 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.453669 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-service-ca\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.453829 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.453805 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-oauth-config\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.453993 
ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.453854 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-serving-cert\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.453993 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.453935 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-config\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.453993 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.453983 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-trusted-ca-bundle\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.554739 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.554700 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-trusted-ca-bundle\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.554949 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.554758 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-oauth-serving-cert\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.554949 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.554800 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55zw2\" (UniqueName: \"kubernetes.io/projected/9147f1c1-a7c3-4865-89aa-aa7083bb2032-kube-api-access-55zw2\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.554949 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.554838 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-service-ca\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.554949 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.554911 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-oauth-config\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.554949 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.554944 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-serving-cert\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.555227 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.554983 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-config\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.555613 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.555579 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-oauth-serving-cert\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.555716 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.555632 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-config\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.555716 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.555657 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-trusted-ca-bundle\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.555835 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.555800 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-service-ca\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.557955 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.557919 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-oauth-config\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.558055 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.558015 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-serving-cert\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.562823 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.562771 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55zw2\" (UniqueName: \"kubernetes.io/projected/9147f1c1-a7c3-4865-89aa-aa7083bb2032-kube-api-access-55zw2\") pod \"console-7f5b686dbd-7w942\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.597925 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.597886 2583 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59ddc6565f-t2dq6" event={"ID":"9eea08ec-8c00-4216-9596-b85403a43834","Type":"ContainerStarted","Data":"6dcf5333b7a68325628c746046755a854d7c0ebe93eb569c624a41887fb4f7cb"} Apr 22 20:00:55.613500 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.613439 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59ddc6565f-t2dq6" podStartSLOduration=4.766380467 podStartE2EDuration="8.613420075s" podCreationTimestamp="2026-04-22 20:00:47 +0000 UTC" firstStartedPulling="2026-04-22 20:00:51.509581369 +0000 UTC m=+139.929488139" lastFinishedPulling="2026-04-22 20:00:55.356620963 +0000 UTC m=+143.776527747" observedRunningTime="2026-04-22 20:00:55.612574508 +0000 UTC m=+144.032481300" watchObservedRunningTime="2026-04-22 20:00:55.613420075 +0000 UTC m=+144.033326868" Apr 22 20:00:55.648453 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.648425 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:00:55.787375 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:55.787332 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f5b686dbd-7w942"] Apr 22 20:00:55.791062 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:00:55.791035 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9147f1c1_a7c3_4865_89aa_aa7083bb2032.slice/crio-5bf4a8b3c5f56fd31cf632798b9f225920ca133d71a9e84751fbb01bb785f0de WatchSource:0}: Error finding container 5bf4a8b3c5f56fd31cf632798b9f225920ca133d71a9e84751fbb01bb785f0de: Status 404 returned error can't find the container with id 5bf4a8b3c5f56fd31cf632798b9f225920ca133d71a9e84751fbb01bb785f0de Apr 22 20:00:56.603358 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:56.603313 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5b686dbd-7w942" event={"ID":"9147f1c1-a7c3-4865-89aa-aa7083bb2032","Type":"ContainerStarted","Data":"0789e8a4ed46c8a19a26b1286bafea456be44be9fa674073b4dae99bf7f133b4"} Apr 22 20:00:56.603358 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:56.603363 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5b686dbd-7w942" event={"ID":"9147f1c1-a7c3-4865-89aa-aa7083bb2032","Type":"ContainerStarted","Data":"5bf4a8b3c5f56fd31cf632798b9f225920ca133d71a9e84751fbb01bb785f0de"} Apr 22 20:00:56.619728 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:56.619672 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f5b686dbd-7w942" podStartSLOduration=1.619640796 podStartE2EDuration="1.619640796s" podCreationTimestamp="2026-04-22 20:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:00:56.618623032 +0000 UTC m=+145.038529822" watchObservedRunningTime="2026-04-22 20:00:56.619640796 +0000 UTC m=+145.039547589" Apr 22 20:00:57.450502 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:57.450468 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:57.450682 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:57.450514 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:57.456262 
ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:57.456233 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:00:57.611539 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:00:57.611509 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:01:00.619180 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:00.619091 2583 generic.go:358] "Generic (PLEG): container finished" podID="93becda5-df4b-41f0-954c-ed611504c70c" containerID="9ecef90c4b5ea802e9acd0ef087be3a9b28e1adf4df4a486567817b6719c3c5b" exitCode=0 Apr 22 20:01:00.619180 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:00.619128 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-br2nv" event={"ID":"93becda5-df4b-41f0-954c-ed611504c70c","Type":"ContainerDied","Data":"9ecef90c4b5ea802e9acd0ef087be3a9b28e1adf4df4a486567817b6719c3c5b"} Apr 22 20:01:00.619703 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:00.619505 2583 scope.go:117] "RemoveContainer" containerID="9ecef90c4b5ea802e9acd0ef087be3a9b28e1adf4df4a486567817b6719c3c5b" Apr 22 20:01:01.623931 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:01.623896 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-br2nv" event={"ID":"93becda5-df4b-41f0-954c-ed611504c70c","Type":"ContainerStarted","Data":"191b9fbdaa0da48042c9ee74e469a6384f81e8958bb550a610393b55d99316cb"} Apr 22 20:01:04.633791 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:04.633705 2583 generic.go:358] "Generic (PLEG): container finished" podID="32c9ae24-cac4-4927-a2d1-a0542ac9e54d" containerID="a0163fd8b76649ef59a757122a48d1db4412212bcab1df8b346e6fc4f4858e1b" exitCode=0 Apr 22 20:01:04.634174 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:04.633775 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" event={"ID":"32c9ae24-cac4-4927-a2d1-a0542ac9e54d","Type":"ContainerDied","Data":"a0163fd8b76649ef59a757122a48d1db4412212bcab1df8b346e6fc4f4858e1b"} Apr 22 20:01:04.634174 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:04.634123 2583 scope.go:117] "RemoveContainer" containerID="a0163fd8b76649ef59a757122a48d1db4412212bcab1df8b346e6fc4f4858e1b" Apr 22 20:01:05.638394 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:05.638362 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csj2c" event={"ID":"32c9ae24-cac4-4927-a2d1-a0542ac9e54d","Type":"ContainerStarted","Data":"371f6dc28506569f6269a79f399df59e8d5888340803684bc933f924e5bdfaab"} Apr 22 20:01:05.649329 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:05.649304 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:01:05.649452 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:05.649337 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:01:05.654415 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:05.654389 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:01:05.850402 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:05.850367 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5" Apr 22 20:01:05.858592 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:05.858564 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-74c8dd4d84-45lt5" Apr 22 20:01:06.645614 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:06.645583 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:01:06.686212 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:06.686174 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59ddc6565f-t2dq6"] Apr 22 20:01:31.706748 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:31.706631 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-59ddc6565f-t2dq6" podUID="9eea08ec-8c00-4216-9596-b85403a43834" containerName="console" containerID="cri-o://6dcf5333b7a68325628c746046755a854d7c0ebe93eb569c624a41887fb4f7cb" gracePeriod=15 Apr 22 20:01:31.970852 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:31.970828 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59ddc6565f-t2dq6_9eea08ec-8c00-4216-9596-b85403a43834/console/0.log" Apr 22 20:01:31.970992 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:31.970925 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:01:32.071453 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.071417 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-oauth-serving-cert\") pod \"9eea08ec-8c00-4216-9596-b85403a43834\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " Apr 22 20:01:32.071602 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.071480 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-console-config\") pod \"9eea08ec-8c00-4216-9596-b85403a43834\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " Apr 22 20:01:32.071602 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.071504 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-service-ca\") pod \"9eea08ec-8c00-4216-9596-b85403a43834\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " Apr 22 20:01:32.071602 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.071531 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9eea08ec-8c00-4216-9596-b85403a43834-console-serving-cert\") pod \"9eea08ec-8c00-4216-9596-b85403a43834\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " Apr 22 20:01:32.071602 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.071585 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9eea08ec-8c00-4216-9596-b85403a43834-console-oauth-config\") pod \"9eea08ec-8c00-4216-9596-b85403a43834\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " Apr 22 20:01:32.071807 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.071624 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-f4s88\" (UniqueName: \"kubernetes.io/projected/9eea08ec-8c00-4216-9596-b85403a43834-kube-api-access-f4s88\") pod \"9eea08ec-8c00-4216-9596-b85403a43834\" (UID: \"9eea08ec-8c00-4216-9596-b85403a43834\") " Apr 22 20:01:32.071921 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.071888 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-console-config" (OuterVolumeSpecName: "console-config") pod "9eea08ec-8c00-4216-9596-b85403a43834" (UID: "9eea08ec-8c00-4216-9596-b85403a43834"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:01:32.071986 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.071897 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-service-ca" (OuterVolumeSpecName: "service-ca") pod "9eea08ec-8c00-4216-9596-b85403a43834" (UID: "9eea08ec-8c00-4216-9596-b85403a43834"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:01:32.071986 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.071901 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9eea08ec-8c00-4216-9596-b85403a43834" (UID: "9eea08ec-8c00-4216-9596-b85403a43834"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:01:32.074268 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.074236 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eea08ec-8c00-4216-9596-b85403a43834-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9eea08ec-8c00-4216-9596-b85403a43834" (UID: "9eea08ec-8c00-4216-9596-b85403a43834"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:32.074392 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.074269 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eea08ec-8c00-4216-9596-b85403a43834-kube-api-access-f4s88" (OuterVolumeSpecName: "kube-api-access-f4s88") pod "9eea08ec-8c00-4216-9596-b85403a43834" (UID: "9eea08ec-8c00-4216-9596-b85403a43834"). InnerVolumeSpecName "kube-api-access-f4s88". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:01:32.074392 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.074285 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eea08ec-8c00-4216-9596-b85403a43834-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9eea08ec-8c00-4216-9596-b85403a43834" (UID: "9eea08ec-8c00-4216-9596-b85403a43834"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:32.172408 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.172373 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-oauth-serving-cert\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:32.172408 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.172409 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-console-config\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:32.172663 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.172422 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9eea08ec-8c00-4216-9596-b85403a43834-service-ca\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:32.172663 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.172434 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9eea08ec-8c00-4216-9596-b85403a43834-console-serving-cert\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:32.172663 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.172449 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9eea08ec-8c00-4216-9596-b85403a43834-console-oauth-config\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:32.172663 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.172460 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f4s88\" (UniqueName: \"kubernetes.io/projected/9eea08ec-8c00-4216-9596-b85403a43834-kube-api-access-f4s88\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:32.729445 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.729408 2583 generic.go:358] "Generic (PLEG): container finished" podID="9eea08ec-8c00-4216-9596-b85403a43834" containerID="6dcf5333b7a68325628c746046755a854d7c0ebe93eb569c624a41887fb4f7cb" exitCode=2 Apr 22 20:01:32.729935 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.729481 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59ddc6565f-t2dq6" Apr 22 20:01:32.729935 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.729482 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59ddc6565f-t2dq6" event={"ID":"9eea08ec-8c00-4216-9596-b85403a43834","Type":"ContainerDied","Data":"6dcf5333b7a68325628c746046755a854d7c0ebe93eb569c624a41887fb4f7cb"} Apr 22 20:01:32.729935 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.729585 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59ddc6565f-t2dq6" event={"ID":"9eea08ec-8c00-4216-9596-b85403a43834","Type":"ContainerDied","Data":"6d81cec10e48ee6d41edace0fa747b3cc88c587b3894057883c0430d4f50f1f8"} Apr 22 20:01:32.729935 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.729601 2583 scope.go:117] "RemoveContainer" containerID="6dcf5333b7a68325628c746046755a854d7c0ebe93eb569c624a41887fb4f7cb" Apr 22 20:01:32.737990 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.737974 2583 scope.go:117] "RemoveContainer" containerID="6dcf5333b7a68325628c746046755a854d7c0ebe93eb569c624a41887fb4f7cb" Apr 22 20:01:32.738221 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:01:32.738201 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dcf5333b7a68325628c746046755a854d7c0ebe93eb569c624a41887fb4f7cb\": container with ID starting with 6dcf5333b7a68325628c746046755a854d7c0ebe93eb569c624a41887fb4f7cb not found: ID does not exist" containerID="6dcf5333b7a68325628c746046755a854d7c0ebe93eb569c624a41887fb4f7cb" Apr 22 20:01:32.738263 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.738230 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dcf5333b7a68325628c746046755a854d7c0ebe93eb569c624a41887fb4f7cb"} err="failed to get container status \"6dcf5333b7a68325628c746046755a854d7c0ebe93eb569c624a41887fb4f7cb\": rpc error: code = NotFound desc = could not find container \"6dcf5333b7a68325628c746046755a854d7c0ebe93eb569c624a41887fb4f7cb\": container with ID starting with 6dcf5333b7a68325628c746046755a854d7c0ebe93eb569c624a41887fb4f7cb not found: ID does not exist" Apr 22 20:01:32.755932 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.755907 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59ddc6565f-t2dq6"] Apr 22 20:01:32.763327 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:32.763310 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59ddc6565f-t2dq6"] Apr 22 20:01:34.095632 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:34.095592 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eea08ec-8c00-4216-9596-b85403a43834" path="/var/lib/kubelet/pods/9eea08ec-8c00-4216-9596-b85403a43834/volumes" Apr 22 20:01:38.681146 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.681112 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69b784fffc-znwsh"] Apr 22 20:01:38.681573 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.681430 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9eea08ec-8c00-4216-9596-b85403a43834" containerName="console" Apr 22 20:01:38.681573 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.681440 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eea08ec-8c00-4216-9596-b85403a43834" containerName="console" Apr 22 20:01:38.681573 ip-10-0-135-221 
kubenswrapper[2583]: I0422 20:01:38.681525 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="9eea08ec-8c00-4216-9596-b85403a43834" containerName="console" Apr 22 20:01:38.723624 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.723589 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69b784fffc-znwsh"] Apr 22 20:01:38.723774 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.723712 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.831504 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.831469 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-service-ca\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.831504 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.831506 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-config\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.831688 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.831543 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-oauth-serving-cert\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.831688 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.831574 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-trusted-ca-bundle\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.831688 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.831593 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t97jb\" (UniqueName: \"kubernetes.io/projected/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-kube-api-access-t97jb\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.831688 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.831609 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-serving-cert\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.831688 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.831679 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-oauth-config\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " 
pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.932456 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.932361 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-service-ca\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.932456 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.932408 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-config\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.932681 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.932467 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-oauth-serving-cert\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.932681 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.932497 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-trusted-ca-bundle\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.932681 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.932530 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t97jb\" (UniqueName: \"kubernetes.io/projected/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-kube-api-access-t97jb\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.932681 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.932555 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-serving-cert\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.932681 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.932618 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-oauth-config\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.933232 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.933198 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-service-ca\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.933362 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.933247 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-config\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.933362 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.933292 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-oauth-serving-cert\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.933561 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.933543 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-trusted-ca-bundle\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.935084 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.935055 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-oauth-config\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.935274 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.935252 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-serving-cert\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:38.939771 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:38.939751 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t97jb\" (UniqueName: \"kubernetes.io/projected/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-kube-api-access-t97jb\") pod \"console-69b784fffc-znwsh\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:39.032545 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:39.032505 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:39.155499 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:39.155333 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69b784fffc-znwsh"] Apr 22 20:01:39.158307 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:01:39.158277 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbf8a4ae_6fb7_4b99_a716_bb0c04dc7279.slice/crio-4eff45c6b361ecd2b3973c1c195a40a6e0a310faacd39c8ed6429bce5c1c44e4 WatchSource:0}: Error finding container 4eff45c6b361ecd2b3973c1c195a40a6e0a310faacd39c8ed6429bce5c1c44e4: Status 404 returned error can't find the container with id 4eff45c6b361ecd2b3973c1c195a40a6e0a310faacd39c8ed6429bce5c1c44e4 Apr 22 20:01:39.753538 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:39.753494 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b784fffc-znwsh" event={"ID":"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279","Type":"ContainerStarted","Data":"07bed19084fc84d5265272adc872dd07ee01fc72357ef16d5f16ff2fc42f337c"} Apr 22 20:01:39.753538 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:39.753534 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b784fffc-znwsh" event={"ID":"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279","Type":"ContainerStarted","Data":"4eff45c6b361ecd2b3973c1c195a40a6e0a310faacd39c8ed6429bce5c1c44e4"} Apr 22 20:01:39.769756 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:39.769712 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69b784fffc-znwsh" podStartSLOduration=1.769694394 podStartE2EDuration="1.769694394s" podCreationTimestamp="2026-04-22 20:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:01:39.768756497 +0000 UTC m=+188.188663288" watchObservedRunningTime="2026-04-22 20:01:39.769694394 +0000 UTC m=+188.189601185" Apr 22 20:01:41.387902 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:41.387847 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:01:41.388312 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:41.388269 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="alertmanager" containerID="cri-o://bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f" gracePeriod=120 Apr 22 20:01:41.388385 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:41.388351 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="kube-rbac-proxy-metric" containerID="cri-o://29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664" gracePeriod=120 Apr 22 20:01:41.388443 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:41.388366 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="kube-rbac-proxy-web" containerID="cri-o://12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122" gracePeriod=120 Apr 22 20:01:41.388443 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:41.388396 2583 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="config-reloader" containerID="cri-o://3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc" gracePeriod=120 Apr 22 20:01:41.388546 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:41.388442 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="kube-rbac-proxy" containerID="cri-o://c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea" gracePeriod=120 Apr 22 20:01:41.388546 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:41.388455 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="prom-label-proxy" containerID="cri-o://2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315" gracePeriod=120 Apr 22 20:01:41.763737 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:41.763705 2583 generic.go:358] "Generic (PLEG): container finished" podID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerID="2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315" exitCode=0 Apr 22 20:01:41.763737 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:41.763732 2583 generic.go:358] "Generic (PLEG): container finished" podID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerID="c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea" exitCode=0 Apr 22 20:01:41.763737 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:41.763738 2583 generic.go:358] "Generic (PLEG): container finished" podID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerID="3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc" exitCode=0 Apr 22 20:01:41.763737 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:41.763743 2583 generic.go:358] "Generic (PLEG): container finished" podID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerID="bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f" exitCode=0 Apr 22 20:01:41.764058 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:41.763782 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32de9b24-31b3-4dac-9846-058bdd11cecb","Type":"ContainerDied","Data":"2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315"} Apr 22 20:01:41.764058 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:41.763821 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32de9b24-31b3-4dac-9846-058bdd11cecb","Type":"ContainerDied","Data":"c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea"} Apr 22 20:01:41.764058 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:41.763835 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32de9b24-31b3-4dac-9846-058bdd11cecb","Type":"ContainerDied","Data":"3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc"} Apr 22 20:01:41.764058 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:41.763846 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32de9b24-31b3-4dac-9846-058bdd11cecb","Type":"ContainerDied","Data":"bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f"} Apr 22 20:01:42.519245 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:01:42.519208 
2583 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32de9b24_31b3_4dac_9846_058bdd11cecb.slice/crio-conmon-29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664.scope\": RecentStats: unable to find data in memory cache]" Apr 22 20:01:42.626316 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.626291 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:42.766064 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.765962 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32de9b24-31b3-4dac-9846-058bdd11cecb-metrics-client-ca\") pod \"32de9b24-31b3-4dac-9846-058bdd11cecb\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " Apr 22 20:01:42.766064 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.766013 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32de9b24-31b3-4dac-9846-058bdd11cecb-alertmanager-trusted-ca-bundle\") pod \"32de9b24-31b3-4dac-9846-058bdd11cecb\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " Apr 22 20:01:42.766064 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.766045 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy\") pod \"32de9b24-31b3-4dac-9846-058bdd11cecb\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " Apr 22 20:01:42.766415 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.766073 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy-web\") pod \"32de9b24-31b3-4dac-9846-058bdd11cecb\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " Apr 22 20:01:42.766415 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.766092 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"32de9b24-31b3-4dac-9846-058bdd11cecb\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " Apr 22 20:01:42.766415 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.766121 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-web-config\") pod \"32de9b24-31b3-4dac-9846-058bdd11cecb\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " Apr 22 20:01:42.766415 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.766143 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-578wg\" (UniqueName: \"kubernetes.io/projected/32de9b24-31b3-4dac-9846-058bdd11cecb-kube-api-access-578wg\") pod \"32de9b24-31b3-4dac-9846-058bdd11cecb\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " Apr 22 20:01:42.766618 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.766485 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/32de9b24-31b3-4dac-9846-058bdd11cecb-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "32de9b24-31b3-4dac-9846-058bdd11cecb" (UID: "32de9b24-31b3-4dac-9846-058bdd11cecb"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:01:42.766618 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.766551 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-config-volume\") pod \"32de9b24-31b3-4dac-9846-058bdd11cecb\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " Apr 22 20:01:42.766618 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.766583 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-main-tls\") pod \"32de9b24-31b3-4dac-9846-058bdd11cecb\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " Apr 22 20:01:42.766618 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.766581 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32de9b24-31b3-4dac-9846-058bdd11cecb-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "32de9b24-31b3-4dac-9846-058bdd11cecb" (UID: "32de9b24-31b3-4dac-9846-058bdd11cecb"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:01:42.766823 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.766652 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/32de9b24-31b3-4dac-9846-058bdd11cecb-alertmanager-main-db\") pod \"32de9b24-31b3-4dac-9846-058bdd11cecb\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " Apr 22 20:01:42.766823 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.766685 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-cluster-tls-config\") pod \"32de9b24-31b3-4dac-9846-058bdd11cecb\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " Apr 22 20:01:42.766823 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.766720 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/32de9b24-31b3-4dac-9846-058bdd11cecb-config-out\") pod \"32de9b24-31b3-4dac-9846-058bdd11cecb\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " Apr 22 20:01:42.766823 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.766766 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/32de9b24-31b3-4dac-9846-058bdd11cecb-tls-assets\") pod \"32de9b24-31b3-4dac-9846-058bdd11cecb\" (UID: \"32de9b24-31b3-4dac-9846-058bdd11cecb\") " Apr 22 20:01:42.767175 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.767154 2583 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/32de9b24-31b3-4dac-9846-058bdd11cecb-metrics-client-ca\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:42.767466 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.767185 2583 reconciler_common.go:299] "Volume detached for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32de9b24-31b3-4dac-9846-058bdd11cecb-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:42.769132 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.767709 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32de9b24-31b3-4dac-9846-058bdd11cecb-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "32de9b24-31b3-4dac-9846-058bdd11cecb" (UID: "32de9b24-31b3-4dac-9846-058bdd11cecb"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:01:42.769685 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.769635 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "32de9b24-31b3-4dac-9846-058bdd11cecb" (UID: "32de9b24-31b3-4dac-9846-058bdd11cecb"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:42.770433 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.770405 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32de9b24-31b3-4dac-9846-058bdd11cecb-kube-api-access-578wg" (OuterVolumeSpecName: "kube-api-access-578wg") pod "32de9b24-31b3-4dac-9846-058bdd11cecb" (UID: "32de9b24-31b3-4dac-9846-058bdd11cecb"). InnerVolumeSpecName "kube-api-access-578wg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:01:42.770537 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.770433 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32de9b24-31b3-4dac-9846-058bdd11cecb-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "32de9b24-31b3-4dac-9846-058bdd11cecb" (UID: "32de9b24-31b3-4dac-9846-058bdd11cecb"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:01:42.770620 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.770593 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "32de9b24-31b3-4dac-9846-058bdd11cecb" (UID: "32de9b24-31b3-4dac-9846-058bdd11cecb"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:42.770701 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.770681 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "32de9b24-31b3-4dac-9846-058bdd11cecb" (UID: "32de9b24-31b3-4dac-9846-058bdd11cecb"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:42.770996 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.770977 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "32de9b24-31b3-4dac-9846-058bdd11cecb" (UID: "32de9b24-31b3-4dac-9846-058bdd11cecb"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:42.771263 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.771243 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32de9b24-31b3-4dac-9846-058bdd11cecb-config-out" (OuterVolumeSpecName: "config-out") pod "32de9b24-31b3-4dac-9846-058bdd11cecb" (UID: "32de9b24-31b3-4dac-9846-058bdd11cecb"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:01:42.771652 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.771595 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-config-volume" (OuterVolumeSpecName: "config-volume") pod "32de9b24-31b3-4dac-9846-058bdd11cecb" (UID: "32de9b24-31b3-4dac-9846-058bdd11cecb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:42.772087 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.772065 2583 generic.go:358] "Generic (PLEG): container finished" podID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerID="29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664" exitCode=0 Apr 22 20:01:42.772087 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.772086 2583 generic.go:358] "Generic (PLEG): container finished" podID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerID="12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122" exitCode=0 Apr 22 20:01:42.772232 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.772154 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32de9b24-31b3-4dac-9846-058bdd11cecb","Type":"ContainerDied","Data":"29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664"} Apr 22 20:01:42.772232 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.772196 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32de9b24-31b3-4dac-9846-058bdd11cecb","Type":"ContainerDied","Data":"12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122"} Apr 22 20:01:42.772232 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.772207 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:42.772232 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.772215 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"32de9b24-31b3-4dac-9846-058bdd11cecb","Type":"ContainerDied","Data":"2275f85a50abd4def02dbbe60ef9ee502f3d9621fc56ec18708adc721dc7faaf"} Apr 22 20:01:42.772425 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.772237 2583 scope.go:117] "RemoveContainer" containerID="2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315" Apr 22 20:01:42.775714 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.775694 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "32de9b24-31b3-4dac-9846-058bdd11cecb" (UID: "32de9b24-31b3-4dac-9846-058bdd11cecb"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:42.786770 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.786741 2583 scope.go:117] "RemoveContainer" containerID="29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664" Apr 22 20:01:42.788686 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.788666 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-web-config" (OuterVolumeSpecName: "web-config") pod "32de9b24-31b3-4dac-9846-058bdd11cecb" (UID: "32de9b24-31b3-4dac-9846-058bdd11cecb"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:42.793546 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.793524 2583 scope.go:117] "RemoveContainer" containerID="c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea" Apr 22 20:01:42.799962 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.799947 2583 scope.go:117] "RemoveContainer" containerID="12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122" Apr 22 20:01:42.807677 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.807650 2583 scope.go:117] "RemoveContainer" containerID="3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc" Apr 22 20:01:42.814305 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.814287 2583 scope.go:117] "RemoveContainer" containerID="bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f" Apr 22 20:01:42.820665 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.820646 2583 scope.go:117] "RemoveContainer" containerID="954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f" Apr 22 20:01:42.827053 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.827032 2583 scope.go:117] "RemoveContainer" containerID="2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315" Apr 22 20:01:42.827285 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:01:42.827266 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315\": container with ID starting with 2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315 not found: ID does not exist" containerID="2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315" Apr 22 20:01:42.827345 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.827292 2583 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315"} err="failed to get container status \"2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315\": rpc error: code = NotFound desc = could not find container \"2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315\": container with ID starting with 2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315 not found: ID does not exist" Apr 22 20:01:42.827345 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.827309 2583 scope.go:117] "RemoveContainer" containerID="29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664" Apr 22 20:01:42.827545 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:01:42.827526 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664\": container with ID starting with 29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664 not found: ID does not exist" containerID="29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664" Apr 22 20:01:42.827612 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.827554 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664"} err="failed to get container status \"29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664\": rpc error: code = NotFound desc = could not find container \"29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664\": container with ID starting with 29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664 not found: ID does not exist" Apr 22 20:01:42.827612 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.827577 2583 scope.go:117] "RemoveContainer" containerID="c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea" Apr 22 20:01:42.827803 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:01:42.827786 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea\": container with ID starting with c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea not found: ID does not exist" containerID="c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea" Apr 22 20:01:42.827844 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.827807 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea"} err="failed to get container status \"c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea\": rpc error: code = NotFound desc = could not find container \"c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea\": container with ID starting with c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea not found: ID does not exist" Apr 22 20:01:42.827844 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.827821 2583 scope.go:117] "RemoveContainer" containerID="12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122" Apr 22 20:01:42.828205 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:01:42.828190 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122\": container with ID starting 
with 12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122 not found: ID does not exist" containerID="12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122" Apr 22 20:01:42.828268 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.828209 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122"} err="failed to get container status \"12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122\": rpc error: code = NotFound desc = could not find container \"12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122\": container with ID starting with 12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122 not found: ID does not exist" Apr 22 20:01:42.828268 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.828221 2583 scope.go:117] "RemoveContainer" containerID="3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc" Apr 22 20:01:42.828440 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:01:42.828423 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc\": container with ID starting with 3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc not found: ID does not exist" containerID="3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc" Apr 22 20:01:42.828514 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.828448 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc"} err="failed to get container status \"3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc\": rpc error: code = NotFound desc = could not find container \"3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc\": container with ID starting with 3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc not found: ID does not exist" Apr 22 20:01:42.828514 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.828467 2583 scope.go:117] "RemoveContainer" containerID="bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f" Apr 22 20:01:42.828692 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:01:42.828674 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f\": container with ID starting with bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f not found: ID does not exist" containerID="bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f" Apr 22 20:01:42.828731 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.828696 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f"} err="failed to get container status \"bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f\": rpc error: code = NotFound desc = could not find container \"bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f\": container with ID starting with bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f not found: ID does not exist" Apr 22 20:01:42.828731 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.828709 2583 scope.go:117] "RemoveContainer" 
containerID="954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f" Apr 22 20:01:42.828886 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:01:42.828852 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f\": container with ID starting with 954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f not found: ID does not exist" containerID="954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f" Apr 22 20:01:42.828925 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.828889 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f"} err="failed to get container status \"954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f\": rpc error: code = NotFound desc = could not find container \"954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f\": container with ID starting with 954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f not found: ID does not exist" Apr 22 20:01:42.828925 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.828901 2583 scope.go:117] "RemoveContainer" containerID="2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315" Apr 22 20:01:42.829095 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.829079 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315"} err="failed to get container status \"2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315\": rpc error: code = NotFound desc = could not find container \"2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315\": container with ID starting with 2dfe4f16d189f44fb955a7cd6d0aee51243b90be1a7c27d2cd6f6dcb23dd4315 not found: ID does not exist" Apr 22 20:01:42.829149 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.829095 2583 scope.go:117] "RemoveContainer" containerID="29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664" Apr 22 20:01:42.829301 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.829279 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664"} err="failed to get container status \"29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664\": rpc error: code = NotFound desc = could not find container \"29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664\": container with ID starting with 29785c0b25364420695fe291ec73af6226b3029c0f8c303a9622014fca74a664 not found: ID does not exist" Apr 22 20:01:42.829301 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.829299 2583 scope.go:117] "RemoveContainer" containerID="c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea" Apr 22 20:01:42.829518 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.829499 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea"} err="failed to get container status \"c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea\": rpc error: code = NotFound desc = could not find container \"c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea\": container with ID starting with 
c0160dfc15a9a9b6e03b2f3be1bfeb89e0ba9f047d049fac7bb0386b41ab1fea not found: ID does not exist" Apr 22 20:01:42.829518 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.829519 2583 scope.go:117] "RemoveContainer" containerID="12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122" Apr 22 20:01:42.829722 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.829699 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122"} err="failed to get container status \"12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122\": rpc error: code = NotFound desc = could not find container \"12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122\": container with ID starting with 12bd67ea6a7c98ec5649d0dd87b16c3fe84608f8b93702d441e219448ebf6122 not found: ID does not exist" Apr 22 20:01:42.829768 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.829724 2583 scope.go:117] "RemoveContainer" containerID="3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc" Apr 22 20:01:42.829964 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.829948 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc"} err="failed to get container status \"3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc\": rpc error: code = NotFound desc = could not find container \"3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc\": container with ID starting with 3f87804b23022b6c58f21eed64282fe78d83a521da48f2e51eeb3b5cdf7333fc not found: ID does not exist" Apr 22 20:01:42.830019 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.829964 2583 scope.go:117] "RemoveContainer" containerID="bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f" Apr 22 20:01:42.830174 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.830158 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f"} err="failed to get container status \"bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f\": rpc error: code = NotFound desc = could not find container \"bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f\": container with ID starting with bb7e5c84a2a57f93790972bf49447e24f6bc6b0608e1050b6bf0ef396deab25f not found: ID does not exist" Apr 22 20:01:42.830214 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.830175 2583 scope.go:117] "RemoveContainer" containerID="954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f" Apr 22 20:01:42.830391 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.830373 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f"} err="failed to get container status \"954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f\": rpc error: code = NotFound desc = could not find container \"954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f\": container with ID starting with 954fe8d25e8a4f9483159df96395e7d6f4a52d74fd2f198ac25cf071ef73465f not found: ID does not exist" Apr 22 20:01:42.868166 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.868137 2583 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/32de9b24-31b3-4dac-9846-058bdd11cecb-config-out\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:42.868166 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.868162 2583 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/32de9b24-31b3-4dac-9846-058bdd11cecb-tls-assets\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:42.868166 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.868172 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:42.868375 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.868182 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:42.868375 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.868194 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:42.868375 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.868203 2583 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-web-config\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:42.868375 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.868212 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-578wg\" (UniqueName: \"kubernetes.io/projected/32de9b24-31b3-4dac-9846-058bdd11cecb-kube-api-access-578wg\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:42.868375 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.868220 2583 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-config-volume\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:42.868375 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.868228 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-secret-alertmanager-main-tls\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:42.868375 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.868237 2583 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/32de9b24-31b3-4dac-9846-058bdd11cecb-alertmanager-main-db\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:42.868375 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:42.868245 2583 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/32de9b24-31b3-4dac-9846-058bdd11cecb-cluster-tls-config\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:01:43.095887 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.095837 2583 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:01:43.101379 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.101349 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:01:43.144629 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.144601 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:01:43.145063 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145045 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="kube-rbac-proxy-web" Apr 22 20:01:43.145115 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145065 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="kube-rbac-proxy-web" Apr 22 20:01:43.145115 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145076 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="kube-rbac-proxy-metric" Apr 22 20:01:43.145115 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145081 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="kube-rbac-proxy-metric" Apr 22 20:01:43.145115 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145088 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="alertmanager" Apr 22 20:01:43.145115 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145093 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="alertmanager" Apr 22 20:01:43.145115 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145099 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="config-reloader" Apr 22 20:01:43.145115 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145104 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="config-reloader" Apr 22 20:01:43.145115 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145116 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="kube-rbac-proxy" Apr 22 20:01:43.145347 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145122 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="kube-rbac-proxy" Apr 22 20:01:43.145347 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145130 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="init-config-reloader" Apr 22 20:01:43.145347 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145136 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="init-config-reloader" Apr 22 20:01:43.145347 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145149 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="prom-label-proxy" Apr 22 20:01:43.145347 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145155 2583 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="prom-label-proxy" Apr 22 20:01:43.145347 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145198 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="kube-rbac-proxy-metric" Apr 22 20:01:43.145347 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145208 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="prom-label-proxy" Apr 22 20:01:43.145347 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145213 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="kube-rbac-proxy" Apr 22 20:01:43.145347 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145219 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="config-reloader" Apr 22 20:01:43.145347 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145226 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="alertmanager" Apr 22 20:01:43.145347 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.145233 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" containerName="kube-rbac-proxy-web" Apr 22 20:01:43.148296 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.148279 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:43.150691 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.150665 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 20:01:43.150839 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.150818 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 20:01:43.152459 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.152444 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 20:01:43.152681 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.152668 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 20:01:43.152763 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.152705 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 20:01:43.152980 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.152967 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-szjbv\"" Apr 22 20:01:43.153062 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.152986 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 20:01:43.153123 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.153056 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 20:01:43.153285 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.153267 2583 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 20:01:43.168228 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.168211 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 20:01:43.172244 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.172226 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:01:43.271067 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.271030 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5fc053ca-15dc-43bd-ba88-94d8d38038c6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:43.271241 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.271083 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:43.271241 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.271110 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jmld\" (UniqueName: \"kubernetes.io/projected/5fc053ca-15dc-43bd-ba88-94d8d38038c6-kube-api-access-8jmld\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:43.271241 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.271181 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5fc053ca-15dc-43bd-ba88-94d8d38038c6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:43.271241 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.271220 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5fc053ca-15dc-43bd-ba88-94d8d38038c6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:43.271241 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.271238 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:43.271480 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.271254 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
Apr 22 20:01:43.271480 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.271386 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-config-volume\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.271480 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.271420 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.271480 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.271440 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-web-config\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.271480 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.271458 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5fc053ca-15dc-43bd-ba88-94d8d38038c6-config-out\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.271480 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.271473 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.372265 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.372178 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.372265 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.372216 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jmld\" (UniqueName: \"kubernetes.io/projected/5fc053ca-15dc-43bd-ba88-94d8d38038c6-kube-api-access-8jmld\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.372265 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.372244 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5fc053ca-15dc-43bd-ba88-94d8d38038c6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.372265 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.372262 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5fc053ca-15dc-43bd-ba88-94d8d38038c6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.372572 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.372284 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.372572 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.372309 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.372572 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.372351 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fc053ca-15dc-43bd-ba88-94d8d38038c6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.372572 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.372383 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-config-volume\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.372572 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.372410 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.372572 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.372431 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-web-config\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.372572 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.372454 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5fc053ca-15dc-43bd-ba88-94d8d38038c6-config-out\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.372572 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.372478 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.372572 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.372526 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5fc053ca-15dc-43bd-ba88-94d8d38038c6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.373109 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.372912 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5fc053ca-15dc-43bd-ba88-94d8d38038c6-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.373474 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.373421 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5fc053ca-15dc-43bd-ba88-94d8d38038c6-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.374045 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.373989 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fc053ca-15dc-43bd-ba88-94d8d38038c6-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.375828 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.375681 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.375941 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.375832 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5fc053ca-15dc-43bd-ba88-94d8d38038c6-config-out\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.376019 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.375956 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5fc053ca-15dc-43bd-ba88-94d8d38038c6-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.376088 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.376044 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.376143 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.376107 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.376438 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.376417 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-web-config\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.376500 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.376448 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.376603 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.376585 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.377230 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.377213 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5fc053ca-15dc-43bd-ba88-94d8d38038c6-config-volume\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.379443 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.379428 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jmld\" (UniqueName: \"kubernetes.io/projected/5fc053ca-15dc-43bd-ba88-94d8d38038c6-kube-api-access-8jmld\") pod \"alertmanager-main-0\" (UID: \"5fc053ca-15dc-43bd-ba88-94d8d38038c6\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:01:43.457799 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.457766 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:01:43.585993 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.585971 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:01:43.588306 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:01:43.588280 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fc053ca_15dc_43bd_ba88_94d8d38038c6.slice/crio-e07c92de9c105ebc5f2d93532462ed3bcd519d1e67de5526ac42e7798ca1ae32 WatchSource:0}: Error finding container e07c92de9c105ebc5f2d93532462ed3bcd519d1e67de5526ac42e7798ca1ae32: Status 404 returned error can't find the container with id e07c92de9c105ebc5f2d93532462ed3bcd519d1e67de5526ac42e7798ca1ae32 Apr 22 20:01:43.780130 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.780098 2583 generic.go:358] "Generic (PLEG): container finished" podID="5fc053ca-15dc-43bd-ba88-94d8d38038c6" containerID="3602e1819ddf62d2a5d794bbea2fe17828db4a408352f3a83b5f3c7bc052640d" exitCode=0 Apr 22 20:01:43.780287 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.780186 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5fc053ca-15dc-43bd-ba88-94d8d38038c6","Type":"ContainerDied","Data":"3602e1819ddf62d2a5d794bbea2fe17828db4a408352f3a83b5f3c7bc052640d"} Apr 22 20:01:43.780287 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:43.780220 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5fc053ca-15dc-43bd-ba88-94d8d38038c6","Type":"ContainerStarted","Data":"e07c92de9c105ebc5f2d93532462ed3bcd519d1e67de5526ac42e7798ca1ae32"} Apr 22 20:01:44.098265 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:44.098237 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32de9b24-31b3-4dac-9846-058bdd11cecb" path="/var/lib/kubelet/pods/32de9b24-31b3-4dac-9846-058bdd11cecb/volumes" Apr 22 20:01:44.787043 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:44.787001 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5fc053ca-15dc-43bd-ba88-94d8d38038c6","Type":"ContainerStarted","Data":"02989dc21628fd49fa3385b4c47c15270ba6261410201a22e81150eaa62e449a"} Apr 22 20:01:44.787043 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:44.787041 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5fc053ca-15dc-43bd-ba88-94d8d38038c6","Type":"ContainerStarted","Data":"538a3a62828f4d538017c201de676cd577446a0244d3d2d00957075fe1e23229"} Apr 22 20:01:44.787043 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:44.787052 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5fc053ca-15dc-43bd-ba88-94d8d38038c6","Type":"ContainerStarted","Data":"911194562bc76336aa86789178fdcbed8e4eccd24c7ad8dd2da0e9c227a4edaa"} Apr 22 20:01:44.787515 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:44.787063 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5fc053ca-15dc-43bd-ba88-94d8d38038c6","Type":"ContainerStarted","Data":"5e1c963479f11736de708f68425f668dffdaffc3be030bc5bdf4b1fb67219fa0"} Apr 22 20:01:44.787515 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:44.787073 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5fc053ca-15dc-43bd-ba88-94d8d38038c6","Type":"ContainerStarted","Data":"12e819ba6fe401dc5a85037de50ea9e02385f3f5cfb43cebc2f4e17c47739e49"} Apr 22 20:01:44.787515 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:44.787083 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5fc053ca-15dc-43bd-ba88-94d8d38038c6","Type":"ContainerStarted","Data":"3619b1d1b354641c7a0a3a48adba3c787ab459d29d05408ed04295926a41ef28"} Apr 22 20:01:44.812807 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:44.812762 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.812742934 podStartE2EDuration="1.812742934s" podCreationTimestamp="2026-04-22 20:01:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:01:44.811885574 +0000 UTC m=+193.231792362" watchObservedRunningTime="2026-04-22 20:01:44.812742934 +0000 UTC m=+193.232649725" Apr 22 20:01:49.033414 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:49.033381 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:49.033414 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:49.033425 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:49.038440 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:49.038417 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:49.808254 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:49.808226 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:01:49.855533 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:01:49.855503 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f5b686dbd-7w942"] Apr 22 20:02:14.875093 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:14.875031 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7f5b686dbd-7w942" podUID="9147f1c1-a7c3-4865-89aa-aa7083bb2032" containerName="console" containerID="cri-o://0789e8a4ed46c8a19a26b1286bafea456be44be9fa674073b4dae99bf7f133b4" gracePeriod=15 Apr 22 20:02:15.120371 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.120350 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f5b686dbd-7w942_9147f1c1-a7c3-4865-89aa-aa7083bb2032/console/0.log" Apr 22 20:02:15.120487 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.120409 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:02:15.140188 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.140099 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-oauth-serving-cert\") pod \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " Apr 22 20:02:15.140188 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.140151 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-trusted-ca-bundle\") pod \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " Apr 22 20:02:15.140188 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.140187 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-serving-cert\") pod \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " Apr 22 20:02:15.140593 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.140235 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55zw2\" (UniqueName: \"kubernetes.io/projected/9147f1c1-a7c3-4865-89aa-aa7083bb2032-kube-api-access-55zw2\") pod \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " Apr 22 20:02:15.140593 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.140275 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-oauth-config\") pod \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " Apr 22 20:02:15.140593 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.140311 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-service-ca\") pod \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " Apr 22 20:02:15.140593 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.140374 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-config\") pod \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\" (UID: \"9147f1c1-a7c3-4865-89aa-aa7083bb2032\") " Apr 22 20:02:15.140593 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.140575 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9147f1c1-a7c3-4865-89aa-aa7083bb2032" (UID: "9147f1c1-a7c3-4865-89aa-aa7083bb2032"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:02:15.140918 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.140618 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9147f1c1-a7c3-4865-89aa-aa7083bb2032" (UID: "9147f1c1-a7c3-4865-89aa-aa7083bb2032"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:02:15.141324 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.141286 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-service-ca" (OuterVolumeSpecName: "service-ca") pod "9147f1c1-a7c3-4865-89aa-aa7083bb2032" (UID: "9147f1c1-a7c3-4865-89aa-aa7083bb2032"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:02:15.141324 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.141297 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-config" (OuterVolumeSpecName: "console-config") pod "9147f1c1-a7c3-4865-89aa-aa7083bb2032" (UID: "9147f1c1-a7c3-4865-89aa-aa7083bb2032"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:02:15.143167 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.143126 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9147f1c1-a7c3-4865-89aa-aa7083bb2032-kube-api-access-55zw2" (OuterVolumeSpecName: "kube-api-access-55zw2") pod "9147f1c1-a7c3-4865-89aa-aa7083bb2032" (UID: "9147f1c1-a7c3-4865-89aa-aa7083bb2032"). InnerVolumeSpecName "kube-api-access-55zw2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:02:15.143279 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.143214 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9147f1c1-a7c3-4865-89aa-aa7083bb2032" (UID: "9147f1c1-a7c3-4865-89aa-aa7083bb2032"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:02:15.143396 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.143370 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9147f1c1-a7c3-4865-89aa-aa7083bb2032" (UID: "9147f1c1-a7c3-4865-89aa-aa7083bb2032"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:02:15.241815 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.241786 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-config\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:02:15.241815 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.241810 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-oauth-serving-cert\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:02:15.241815 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.241820 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-trusted-ca-bundle\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:02:15.242047 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.241830 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-serving-cert\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:02:15.242047 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.241839 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-55zw2\" (UniqueName: \"kubernetes.io/projected/9147f1c1-a7c3-4865-89aa-aa7083bb2032-kube-api-access-55zw2\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:02:15.242047 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.241847 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9147f1c1-a7c3-4865-89aa-aa7083bb2032-console-oauth-config\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:02:15.242047 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.241857 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9147f1c1-a7c3-4865-89aa-aa7083bb2032-service-ca\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:02:15.889395 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.889369 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f5b686dbd-7w942_9147f1c1-a7c3-4865-89aa-aa7083bb2032/console/0.log" Apr 22 20:02:15.889794 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.889408 2583 generic.go:358] "Generic (PLEG): container finished" podID="9147f1c1-a7c3-4865-89aa-aa7083bb2032" containerID="0789e8a4ed46c8a19a26b1286bafea456be44be9fa674073b4dae99bf7f133b4" exitCode=2 Apr 22 20:02:15.889794 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.889458 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5b686dbd-7w942" event={"ID":"9147f1c1-a7c3-4865-89aa-aa7083bb2032","Type":"ContainerDied","Data":"0789e8a4ed46c8a19a26b1286bafea456be44be9fa674073b4dae99bf7f133b4"} Apr 22 20:02:15.889794 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.889493 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5b686dbd-7w942" event={"ID":"9147f1c1-a7c3-4865-89aa-aa7083bb2032","Type":"ContainerDied","Data":"5bf4a8b3c5f56fd31cf632798b9f225920ca133d71a9e84751fbb01bb785f0de"} Apr 22 20:02:15.889794 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.889508 2583 
scope.go:117] "RemoveContainer" containerID="0789e8a4ed46c8a19a26b1286bafea456be44be9fa674073b4dae99bf7f133b4" Apr 22 20:02:15.889794 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.889469 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f5b686dbd-7w942" Apr 22 20:02:15.898054 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.898035 2583 scope.go:117] "RemoveContainer" containerID="0789e8a4ed46c8a19a26b1286bafea456be44be9fa674073b4dae99bf7f133b4" Apr 22 20:02:15.898313 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:02:15.898295 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0789e8a4ed46c8a19a26b1286bafea456be44be9fa674073b4dae99bf7f133b4\": container with ID starting with 0789e8a4ed46c8a19a26b1286bafea456be44be9fa674073b4dae99bf7f133b4 not found: ID does not exist" containerID="0789e8a4ed46c8a19a26b1286bafea456be44be9fa674073b4dae99bf7f133b4" Apr 22 20:02:15.898351 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.898322 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0789e8a4ed46c8a19a26b1286bafea456be44be9fa674073b4dae99bf7f133b4"} err="failed to get container status \"0789e8a4ed46c8a19a26b1286bafea456be44be9fa674073b4dae99bf7f133b4\": rpc error: code = NotFound desc = could not find container \"0789e8a4ed46c8a19a26b1286bafea456be44be9fa674073b4dae99bf7f133b4\": container with ID starting with 0789e8a4ed46c8a19a26b1286bafea456be44be9fa674073b4dae99bf7f133b4 not found: ID does not exist" Apr 22 20:02:15.908975 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.908954 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f5b686dbd-7w942"] Apr 22 20:02:15.914337 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:15.914301 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7f5b686dbd-7w942"] Apr 22 20:02:16.096104 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:16.096076 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9147f1c1-a7c3-4865-89aa-aa7083bb2032" path="/var/lib/kubelet/pods/9147f1c1-a7c3-4865-89aa-aa7083bb2032/volumes" Apr 22 20:02:30.190857 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.190826 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dp22p"] Apr 22 20:02:30.191316 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.191173 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9147f1c1-a7c3-4865-89aa-aa7083bb2032" containerName="console" Apr 22 20:02:30.191316 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.191184 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="9147f1c1-a7c3-4865-89aa-aa7083bb2032" containerName="console" Apr 22 20:02:30.191316 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.191247 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="9147f1c1-a7c3-4865-89aa-aa7083bb2032" containerName="console" Apr 22 20:02:30.193108 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.193092 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dp22p" Apr 22 20:02:30.195177 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.195149 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 20:02:30.208832 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.208796 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dp22p"] Apr 22 20:02:30.266290 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.266262 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0611db5-4075-4c63-8413-0bc07ce6cf5d-original-pull-secret\") pod \"global-pull-secret-syncer-dp22p\" (UID: \"d0611db5-4075-4c63-8413-0bc07ce6cf5d\") " pod="kube-system/global-pull-secret-syncer-dp22p" Apr 22 20:02:30.266457 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.266313 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d0611db5-4075-4c63-8413-0bc07ce6cf5d-kubelet-config\") pod \"global-pull-secret-syncer-dp22p\" (UID: \"d0611db5-4075-4c63-8413-0bc07ce6cf5d\") " pod="kube-system/global-pull-secret-syncer-dp22p" Apr 22 20:02:30.266457 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.266366 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d0611db5-4075-4c63-8413-0bc07ce6cf5d-dbus\") pod \"global-pull-secret-syncer-dp22p\" (UID: \"d0611db5-4075-4c63-8413-0bc07ce6cf5d\") " pod="kube-system/global-pull-secret-syncer-dp22p" Apr 22 20:02:30.367206 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.367160 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d0611db5-4075-4c63-8413-0bc07ce6cf5d-kubelet-config\") pod \"global-pull-secret-syncer-dp22p\" (UID: \"d0611db5-4075-4c63-8413-0bc07ce6cf5d\") " pod="kube-system/global-pull-secret-syncer-dp22p" Apr 22 20:02:30.367453 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.367217 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d0611db5-4075-4c63-8413-0bc07ce6cf5d-dbus\") pod \"global-pull-secret-syncer-dp22p\" (UID: \"d0611db5-4075-4c63-8413-0bc07ce6cf5d\") " pod="kube-system/global-pull-secret-syncer-dp22p" Apr 22 20:02:30.367453 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.367290 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0611db5-4075-4c63-8413-0bc07ce6cf5d-original-pull-secret\") pod \"global-pull-secret-syncer-dp22p\" (UID: \"d0611db5-4075-4c63-8413-0bc07ce6cf5d\") " pod="kube-system/global-pull-secret-syncer-dp22p" Apr 22 20:02:30.367453 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.367286 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d0611db5-4075-4c63-8413-0bc07ce6cf5d-kubelet-config\") pod \"global-pull-secret-syncer-dp22p\" (UID: \"d0611db5-4075-4c63-8413-0bc07ce6cf5d\") " pod="kube-system/global-pull-secret-syncer-dp22p" Apr 22 20:02:30.367453 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.367359 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d0611db5-4075-4c63-8413-0bc07ce6cf5d-dbus\") pod \"global-pull-secret-syncer-dp22p\" (UID: \"d0611db5-4075-4c63-8413-0bc07ce6cf5d\") " pod="kube-system/global-pull-secret-syncer-dp22p" Apr 22 20:02:30.369717 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.369686 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0611db5-4075-4c63-8413-0bc07ce6cf5d-original-pull-secret\") pod \"global-pull-secret-syncer-dp22p\" (UID: \"d0611db5-4075-4c63-8413-0bc07ce6cf5d\") " pod="kube-system/global-pull-secret-syncer-dp22p" Apr 22 20:02:30.507142 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.507056 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dp22p" Apr 22 20:02:30.625275 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.625243 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dp22p"] Apr 22 20:02:30.628855 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:02:30.628816 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0611db5_4075_4c63_8413_0bc07ce6cf5d.slice/crio-f6a4ebff4ef38006431bce8188910eb45b21da25fc5ff7f56f00b690a1d2fec2 WatchSource:0}: Error finding container f6a4ebff4ef38006431bce8188910eb45b21da25fc5ff7f56f00b690a1d2fec2: Status 404 returned error can't find the container with id f6a4ebff4ef38006431bce8188910eb45b21da25fc5ff7f56f00b690a1d2fec2 Apr 22 20:02:30.946437 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:30.946398 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dp22p" event={"ID":"d0611db5-4075-4c63-8413-0bc07ce6cf5d","Type":"ContainerStarted","Data":"f6a4ebff4ef38006431bce8188910eb45b21da25fc5ff7f56f00b690a1d2fec2"} Apr 22 20:02:34.962205 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:34.962168 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dp22p" event={"ID":"d0611db5-4075-4c63-8413-0bc07ce6cf5d","Type":"ContainerStarted","Data":"c6670518cfa04d9b26ec0d733f291a48ad1ee9e646413cf1c82e40f112c6a512"} Apr 22 20:02:34.979257 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:34.979213 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dp22p" podStartSLOduration=1.062864186 podStartE2EDuration="4.979200219s" podCreationTimestamp="2026-04-22 20:02:30 +0000 UTC" firstStartedPulling="2026-04-22 20:02:30.630438654 +0000 UTC m=+239.050345423" lastFinishedPulling="2026-04-22 20:02:34.54677467 +0000 UTC m=+242.966681456" observedRunningTime="2026-04-22 20:02:34.97756946 +0000 UTC m=+243.397476264" watchObservedRunningTime="2026-04-22 20:02:34.979200219 +0000 UTC m=+243.399107010" Apr 22 20:02:53.573887 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:53.573773 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49"] Apr 22 20:02:53.575987 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:53.575956 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" Apr 22 20:02:53.578156 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:53.578138 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 20:02:53.578269 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:53.578163 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 20:02:53.578269 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:53.578194 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-qtm6b\"" Apr 22 20:02:53.584759 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:53.584734 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49"] Apr 22 20:02:53.767518 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:53.767487 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6da7add-e4d9-46fa-921b-ead3592d653c-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49\" (UID: \"e6da7add-e4d9-46fa-921b-ead3592d653c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" Apr 22 20:02:53.767720 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:53.767528 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzqdl\" (UniqueName: \"kubernetes.io/projected/e6da7add-e4d9-46fa-921b-ead3592d653c-kube-api-access-fzqdl\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49\" (UID: \"e6da7add-e4d9-46fa-921b-ead3592d653c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" Apr 22 20:02:53.767720 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:53.767609 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6da7add-e4d9-46fa-921b-ead3592d653c-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49\" (UID: \"e6da7add-e4d9-46fa-921b-ead3592d653c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" Apr 22 20:02:53.868106 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:53.868015 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6da7add-e4d9-46fa-921b-ead3592d653c-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49\" (UID: \"e6da7add-e4d9-46fa-921b-ead3592d653c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" Apr 22 20:02:53.868106 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:53.868097 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6da7add-e4d9-46fa-921b-ead3592d653c-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49\" (UID: \"e6da7add-e4d9-46fa-921b-ead3592d653c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" Apr 22 20:02:53.868338 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:53.868121 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fzqdl\" (UniqueName: \"kubernetes.io/projected/e6da7add-e4d9-46fa-921b-ead3592d653c-kube-api-access-fzqdl\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49\" (UID: \"e6da7add-e4d9-46fa-921b-ead3592d653c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" Apr 22 20:02:53.868463 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:53.868440 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6da7add-e4d9-46fa-921b-ead3592d653c-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49\" (UID: \"e6da7add-e4d9-46fa-921b-ead3592d653c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" Apr 22 20:02:53.868528 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:53.868460 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6da7add-e4d9-46fa-921b-ead3592d653c-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49\" (UID: \"e6da7add-e4d9-46fa-921b-ead3592d653c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" Apr 22 20:02:53.888395 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:53.888369 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzqdl\" (UniqueName: \"kubernetes.io/projected/e6da7add-e4d9-46fa-921b-ead3592d653c-kube-api-access-fzqdl\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49\" (UID: \"e6da7add-e4d9-46fa-921b-ead3592d653c\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" Apr 22 20:02:54.186092 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:54.186057 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" Apr 22 20:02:54.306459 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:54.306433 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49"] Apr 22 20:02:54.308477 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:02:54.308454 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6da7add_e4d9_46fa_921b_ead3592d653c.slice/crio-8d7e2e96c3b21e8872a120bc7602276b788c5f2c1e5059b9f65f2202e6cc5242 WatchSource:0}: Error finding container 8d7e2e96c3b21e8872a120bc7602276b788c5f2c1e5059b9f65f2202e6cc5242: Status 404 returned error can't find the container with id 8d7e2e96c3b21e8872a120bc7602276b788c5f2c1e5059b9f65f2202e6cc5242 Apr 22 20:02:55.022100 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:02:55.022045 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" event={"ID":"e6da7add-e4d9-46fa-921b-ead3592d653c","Type":"ContainerStarted","Data":"8d7e2e96c3b21e8872a120bc7602276b788c5f2c1e5059b9f65f2202e6cc5242"} Apr 22 20:03:00.042489 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:00.042399 2583 generic.go:358] "Generic (PLEG): container finished" podID="e6da7add-e4d9-46fa-921b-ead3592d653c" containerID="de3bdf658b2847407d3ed8c606ddd4d50111085afbdef96b4caf975e702452a3" exitCode=0 Apr 22 20:03:00.042489 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:00.042449 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" event={"ID":"e6da7add-e4d9-46fa-921b-ead3592d653c","Type":"ContainerDied","Data":"de3bdf658b2847407d3ed8c606ddd4d50111085afbdef96b4caf975e702452a3"} Apr 22 20:03:03.054188 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:03.054150 2583 generic.go:358] "Generic (PLEG): container finished" podID="e6da7add-e4d9-46fa-921b-ead3592d653c" containerID="ee0a49cd181a7a00caec63c4287391dee6adbaf343eb161a71fdcc8a54d1df90" exitCode=0 Apr 22 20:03:03.054568 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:03.054236 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" event={"ID":"e6da7add-e4d9-46fa-921b-ead3592d653c","Type":"ContainerDied","Data":"ee0a49cd181a7a00caec63c4287391dee6adbaf343eb161a71fdcc8a54d1df90"} Apr 22 20:03:11.081468 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:11.081427 2583 generic.go:358] "Generic (PLEG): container finished" podID="e6da7add-e4d9-46fa-921b-ead3592d653c" containerID="b96f4de512fdc39720c91358da72f7e8bd1602e3c6a04a3f915be36b657ae00f" exitCode=0 Apr 22 20:03:11.081914 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:11.081543 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" event={"ID":"e6da7add-e4d9-46fa-921b-ead3592d653c","Type":"ContainerDied","Data":"b96f4de512fdc39720c91358da72f7e8bd1602e3c6a04a3f915be36b657ae00f"} Apr 22 20:03:12.222814 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:12.222787 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" Apr 22 20:03:12.306852 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:12.306820 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6da7add-e4d9-46fa-921b-ead3592d653c-util\") pod \"e6da7add-e4d9-46fa-921b-ead3592d653c\" (UID: \"e6da7add-e4d9-46fa-921b-ead3592d653c\") " Apr 22 20:03:12.307054 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:12.306891 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzqdl\" (UniqueName: \"kubernetes.io/projected/e6da7add-e4d9-46fa-921b-ead3592d653c-kube-api-access-fzqdl\") pod \"e6da7add-e4d9-46fa-921b-ead3592d653c\" (UID: \"e6da7add-e4d9-46fa-921b-ead3592d653c\") " Apr 22 20:03:12.307054 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:12.306955 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6da7add-e4d9-46fa-921b-ead3592d653c-bundle\") pod \"e6da7add-e4d9-46fa-921b-ead3592d653c\" (UID: \"e6da7add-e4d9-46fa-921b-ead3592d653c\") " Apr 22 20:03:12.307519 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:12.307492 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6da7add-e4d9-46fa-921b-ead3592d653c-bundle" (OuterVolumeSpecName: "bundle") pod "e6da7add-e4d9-46fa-921b-ead3592d653c" (UID: "e6da7add-e4d9-46fa-921b-ead3592d653c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:03:12.309270 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:12.309244 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6da7add-e4d9-46fa-921b-ead3592d653c-kube-api-access-fzqdl" (OuterVolumeSpecName: "kube-api-access-fzqdl") pod "e6da7add-e4d9-46fa-921b-ead3592d653c" (UID: "e6da7add-e4d9-46fa-921b-ead3592d653c"). InnerVolumeSpecName "kube-api-access-fzqdl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:03:12.310930 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:12.310902 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6da7add-e4d9-46fa-921b-ead3592d653c-util" (OuterVolumeSpecName: "util") pod "e6da7add-e4d9-46fa-921b-ead3592d653c" (UID: "e6da7add-e4d9-46fa-921b-ead3592d653c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:03:12.408266 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:12.408181 2583 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6da7add-e4d9-46fa-921b-ead3592d653c-util\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:03:12.408266 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:12.408210 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fzqdl\" (UniqueName: \"kubernetes.io/projected/e6da7add-e4d9-46fa-921b-ead3592d653c-kube-api-access-fzqdl\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:03:12.408266 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:12.408222 2583 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6da7add-e4d9-46fa-921b-ead3592d653c-bundle\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:03:13.089750 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:13.089715 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" event={"ID":"e6da7add-e4d9-46fa-921b-ead3592d653c","Type":"ContainerDied","Data":"8d7e2e96c3b21e8872a120bc7602276b788c5f2c1e5059b9f65f2202e6cc5242"} Apr 22 20:03:13.089750 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:13.089753 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d7e2e96c3b21e8872a120bc7602276b788c5f2c1e5059b9f65f2202e6cc5242" Apr 22 20:03:13.089992 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:13.089727 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cv2j49" Apr 22 20:03:15.299448 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.299406 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5"] Apr 22 20:03:15.299926 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.299908 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6da7add-e4d9-46fa-921b-ead3592d653c" containerName="extract" Apr 22 20:03:15.299968 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.299929 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6da7add-e4d9-46fa-921b-ead3592d653c" containerName="extract" Apr 22 20:03:15.300005 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.299966 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6da7add-e4d9-46fa-921b-ead3592d653c" containerName="util" Apr 22 20:03:15.300005 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.299974 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6da7add-e4d9-46fa-921b-ead3592d653c" containerName="util" Apr 22 20:03:15.300005 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.299994 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6da7add-e4d9-46fa-921b-ead3592d653c" containerName="pull" Apr 22 20:03:15.300005 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.300000 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6da7add-e4d9-46fa-921b-ead3592d653c" containerName="pull" Apr 22 20:03:15.300120 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.300058 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6da7add-e4d9-46fa-921b-ead3592d653c" containerName="extract" Apr 
22 20:03:15.302375 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.302361 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5" Apr 22 20:03:15.304816 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.304792 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 20:03:15.304816 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.304794 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 20:03:15.305024 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.304796 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-hct4j\"" Apr 22 20:03:15.305024 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.304794 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 20:03:15.329149 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.329127 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5"] Apr 22 20:03:15.330898 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.330881 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/9bf386b4-ed1c-481e-b90b-70444e3896d2-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-knbc5\" (UID: \"9bf386b4-ed1c-481e-b90b-70444e3896d2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5" Apr 22 20:03:15.331007 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.330988 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwmrp\" (UniqueName: \"kubernetes.io/projected/9bf386b4-ed1c-481e-b90b-70444e3896d2-kube-api-access-jwmrp\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-knbc5\" (UID: \"9bf386b4-ed1c-481e-b90b-70444e3896d2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5" Apr 22 20:03:15.431518 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.431477 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwmrp\" (UniqueName: \"kubernetes.io/projected/9bf386b4-ed1c-481e-b90b-70444e3896d2-kube-api-access-jwmrp\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-knbc5\" (UID: \"9bf386b4-ed1c-481e-b90b-70444e3896d2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5" Apr 22 20:03:15.431703 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.431530 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/9bf386b4-ed1c-481e-b90b-70444e3896d2-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-knbc5\" (UID: \"9bf386b4-ed1c-481e-b90b-70444e3896d2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5" Apr 22 20:03:15.433890 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.433847 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/9bf386b4-ed1c-481e-b90b-70444e3896d2-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-knbc5\" (UID: \"9bf386b4-ed1c-481e-b90b-70444e3896d2\") " 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5" Apr 22 20:03:15.444553 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.444522 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwmrp\" (UniqueName: \"kubernetes.io/projected/9bf386b4-ed1c-481e-b90b-70444e3896d2-kube-api-access-jwmrp\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-knbc5\" (UID: \"9bf386b4-ed1c-481e-b90b-70444e3896d2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5" Apr 22 20:03:15.612611 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.612521 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5" Apr 22 20:03:15.736472 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:15.736437 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5"] Apr 22 20:03:15.742075 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:03:15.742048 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bf386b4_ed1c_481e_b90b_70444e3896d2.slice/crio-cd2a300a42341a87062bf7bf934eba62599583755f7eafcc2e3949c4cb42d5d4 WatchSource:0}: Error finding container cd2a300a42341a87062bf7bf934eba62599583755f7eafcc2e3949c4cb42d5d4: Status 404 returned error can't find the container with id cd2a300a42341a87062bf7bf934eba62599583755f7eafcc2e3949c4cb42d5d4 Apr 22 20:03:16.100467 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:16.100429 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5" event={"ID":"9bf386b4-ed1c-481e-b90b-70444e3896d2","Type":"ContainerStarted","Data":"cd2a300a42341a87062bf7bf934eba62599583755f7eafcc2e3949c4cb42d5d4"} Apr 22 20:03:20.115030 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.114938 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5" event={"ID":"9bf386b4-ed1c-481e-b90b-70444e3896d2","Type":"ContainerStarted","Data":"11b2ed2e335ba280b78744c5aaab6a4298367a0d5a27e7c4907e95e7a0a5c4ba"} Apr 22 20:03:20.115415 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.115075 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5" Apr 22 20:03:20.137501 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.137451 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5" podStartSLOduration=1.090486581 podStartE2EDuration="5.137438653s" podCreationTimestamp="2026-04-22 20:03:15 +0000 UTC" firstStartedPulling="2026-04-22 20:03:15.743736126 +0000 UTC m=+284.163642896" lastFinishedPulling="2026-04-22 20:03:19.790688197 +0000 UTC m=+288.210594968" observedRunningTime="2026-04-22 20:03:20.136586153 +0000 UTC m=+288.556492959" watchObservedRunningTime="2026-04-22 20:03:20.137438653 +0000 UTC m=+288.557345443" Apr 22 20:03:20.665414 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.665380 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7"] Apr 22 20:03:20.667612 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.667595 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:20.670222 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.670199 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 20:03:20.670892 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.670853 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 22 20:03:20.671059 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.671042 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-7rtsm\"" Apr 22 20:03:20.682513 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.682489 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7"] Apr 22 20:03:20.780360 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.780323 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpfcp\" (UniqueName: \"kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-kube-api-access-kpfcp\") pod \"keda-metrics-apiserver-7c9f485588-cnjj7\" (UID: \"f1de602c-ced7-49fc-bb79-fb4087c1d85e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:20.780547 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.780365 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/f1de602c-ced7-49fc-bb79-fb4087c1d85e-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-cnjj7\" (UID: \"f1de602c-ced7-49fc-bb79-fb4087c1d85e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:20.780547 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.780402 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cnjj7\" (UID: \"f1de602c-ced7-49fc-bb79-fb4087c1d85e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:20.881308 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.881264 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/f1de602c-ced7-49fc-bb79-fb4087c1d85e-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-cnjj7\" (UID: \"f1de602c-ced7-49fc-bb79-fb4087c1d85e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:20.881308 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.881312 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cnjj7\" (UID: \"f1de602c-ced7-49fc-bb79-fb4087c1d85e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:20.881571 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.881411 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpfcp\" (UniqueName: \"kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-kube-api-access-kpfcp\") pod \"keda-metrics-apiserver-7c9f485588-cnjj7\" (UID: \"f1de602c-ced7-49fc-bb79-fb4087c1d85e\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:20.881571 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:20.881489 2583 secret.go:281] references non-existent secret key: tls.crt Apr 22 20:03:20.881571 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:20.881509 2583 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 20:03:20.881571 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:20.881525 2583 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 22 20:03:20.881571 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:20.881543 2583 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 22 20:03:20.881806 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:20.881598 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-certificates podName:f1de602c-ced7-49fc-bb79-fb4087c1d85e nodeName:}" failed. No retries permitted until 2026-04-22 20:03:21.381579341 +0000 UTC m=+289.801486126 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-certificates") pod "keda-metrics-apiserver-7c9f485588-cnjj7" (UID: "f1de602c-ced7-49fc-bb79-fb4087c1d85e") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 22 20:03:20.881806 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.881646 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/f1de602c-ced7-49fc-bb79-fb4087c1d85e-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-cnjj7\" (UID: \"f1de602c-ced7-49fc-bb79-fb4087c1d85e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:20.891610 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:20.891588 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpfcp\" (UniqueName: \"kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-kube-api-access-kpfcp\") pod \"keda-metrics-apiserver-7c9f485588-cnjj7\" (UID: \"f1de602c-ced7-49fc-bb79-fb4087c1d85e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:21.385541 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:21.385510 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cnjj7\" (UID: \"f1de602c-ced7-49fc-bb79-fb4087c1d85e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:21.385943 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:21.385647 2583 secret.go:281] references non-existent secret key: tls.crt Apr 22 20:03:21.385943 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:21.385666 2583 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 20:03:21.385943 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:21.385685 2583 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7: 
references non-existent secret key: tls.crt Apr 22 20:03:21.385943 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:21.385739 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-certificates podName:f1de602c-ced7-49fc-bb79-fb4087c1d85e nodeName:}" failed. No retries permitted until 2026-04-22 20:03:22.385724177 +0000 UTC m=+290.805630951 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-certificates") pod "keda-metrics-apiserver-7c9f485588-cnjj7" (UID: "f1de602c-ced7-49fc-bb79-fb4087c1d85e") : references non-existent secret key: tls.crt Apr 22 20:03:22.395238 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:22.395203 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cnjj7\" (UID: \"f1de602c-ced7-49fc-bb79-fb4087c1d85e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:22.395647 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:22.395381 2583 secret.go:281] references non-existent secret key: tls.crt Apr 22 20:03:22.395647 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:22.395404 2583 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 20:03:22.395647 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:22.395427 2583 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7: references non-existent secret key: tls.crt Apr 22 20:03:22.395647 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:22.395499 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-certificates podName:f1de602c-ced7-49fc-bb79-fb4087c1d85e nodeName:}" failed. No retries permitted until 2026-04-22 20:03:24.395480774 +0000 UTC m=+292.815387550 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-certificates") pod "keda-metrics-apiserver-7c9f485588-cnjj7" (UID: "f1de602c-ced7-49fc-bb79-fb4087c1d85e") : references non-existent secret key: tls.crt Apr 22 20:03:24.409784 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:24.409746 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cnjj7\" (UID: \"f1de602c-ced7-49fc-bb79-fb4087c1d85e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:24.410238 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:24.409914 2583 secret.go:281] references non-existent secret key: tls.crt Apr 22 20:03:24.410238 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:24.409934 2583 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 20:03:24.410238 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:24.409954 2583 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7: references non-existent secret key: tls.crt Apr 22 20:03:24.410238 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:03:24.410006 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-certificates podName:f1de602c-ced7-49fc-bb79-fb4087c1d85e nodeName:}" failed. No retries permitted until 2026-04-22 20:03:28.409991007 +0000 UTC m=+296.829897776 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-certificates") pod "keda-metrics-apiserver-7c9f485588-cnjj7" (UID: "f1de602c-ced7-49fc-bb79-fb4087c1d85e") : references non-existent secret key: tls.crt Apr 22 20:03:28.443044 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:28.443006 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cnjj7\" (UID: \"f1de602c-ced7-49fc-bb79-fb4087c1d85e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:28.445691 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:28.445665 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f1de602c-ced7-49fc-bb79-fb4087c1d85e-certificates\") pod \"keda-metrics-apiserver-7c9f485588-cnjj7\" (UID: \"f1de602c-ced7-49fc-bb79-fb4087c1d85e\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:28.477607 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:28.477577 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:28.601702 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:28.601676 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7"] Apr 22 20:03:28.604251 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:03:28.604224 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1de602c_ced7_49fc_bb79_fb4087c1d85e.slice/crio-56a6b8761cb758c8e5f34692331f2af64edaafe415985235a67eff4b902a7d04 WatchSource:0}: Error finding container 56a6b8761cb758c8e5f34692331f2af64edaafe415985235a67eff4b902a7d04: Status 404 returned error can't find the container with id 56a6b8761cb758c8e5f34692331f2af64edaafe415985235a67eff4b902a7d04 Apr 22 20:03:29.142567 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:29.142531 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" event={"ID":"f1de602c-ced7-49fc-bb79-fb4087c1d85e","Type":"ContainerStarted","Data":"56a6b8761cb758c8e5f34692331f2af64edaafe415985235a67eff4b902a7d04"} Apr 22 20:03:32.015783 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:32.015757 2583 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 20:03:33.156513 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:33.156473 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" event={"ID":"f1de602c-ced7-49fc-bb79-fb4087c1d85e","Type":"ContainerStarted","Data":"904459dca3d8a397e3cb9cd515ca1127bf9a14b94bc6a9bfbce5ace47967f379"} Apr 22 20:03:33.156948 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:33.156596 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:03:33.174891 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:33.174831 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" podStartSLOduration=9.587630717 podStartE2EDuration="13.174818049s" podCreationTimestamp="2026-04-22 20:03:20 +0000 UTC" firstStartedPulling="2026-04-22 20:03:28.605461008 +0000 UTC m=+297.025367778" lastFinishedPulling="2026-04-22 20:03:32.192648331 +0000 UTC m=+300.612555110" observedRunningTime="2026-04-22 20:03:33.172449389 +0000 UTC m=+301.592356174" watchObservedRunningTime="2026-04-22 20:03:33.174818049 +0000 UTC m=+301.594724839" Apr 22 20:03:41.119690 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:41.119662 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-knbc5" Apr 22 20:03:44.164912 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:03:44.164879 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-cnjj7" Apr 22 20:04:26.509558 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.509468 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-665c47d676-j9ccm"] Apr 22 20:04:26.511726 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.511707 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-j9ccm" Apr 22 20:04:26.514299 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.514280 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 20:04:26.514950 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.514932 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 20:04:26.515053 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.514949 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-9bmhm\"" Apr 22 20:04:26.515053 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.514932 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 20:04:26.524249 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.524226 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-j9ccm"] Apr 22 20:04:26.561114 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.561084 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-cr99v"] Apr 22 20:04:26.563383 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.563364 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-cr99v" Apr 22 20:04:26.566322 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.566304 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4hhk5\"" Apr 22 20:04:26.566464 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.566446 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 20:04:26.577103 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.577082 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-cr99v"] Apr 22 20:04:26.655703 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.655667 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41c59d17-f97b-4bd8-b285-84945993f060-cert\") pod \"kserve-controller-manager-665c47d676-j9ccm\" (UID: \"41c59d17-f97b-4bd8-b285-84945993f060\") " pod="kserve/kserve-controller-manager-665c47d676-j9ccm" Apr 22 20:04:26.655937 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.655735 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ef92bce9-74ab-49c5-bd6e-125bd956a39a-data\") pod \"seaweedfs-86cc847c5c-cr99v\" (UID: \"ef92bce9-74ab-49c5-bd6e-125bd956a39a\") " pod="kserve/seaweedfs-86cc847c5c-cr99v" Apr 22 20:04:26.655937 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.655759 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5mw5\" (UniqueName: \"kubernetes.io/projected/41c59d17-f97b-4bd8-b285-84945993f060-kube-api-access-w5mw5\") pod \"kserve-controller-manager-665c47d676-j9ccm\" (UID: \"41c59d17-f97b-4bd8-b285-84945993f060\") " pod="kserve/kserve-controller-manager-665c47d676-j9ccm" Apr 22 20:04:26.655937 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.655800 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q62t4\" 
(UniqueName: \"kubernetes.io/projected/ef92bce9-74ab-49c5-bd6e-125bd956a39a-kube-api-access-q62t4\") pod \"seaweedfs-86cc847c5c-cr99v\" (UID: \"ef92bce9-74ab-49c5-bd6e-125bd956a39a\") " pod="kserve/seaweedfs-86cc847c5c-cr99v" Apr 22 20:04:26.756602 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.756568 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q62t4\" (UniqueName: \"kubernetes.io/projected/ef92bce9-74ab-49c5-bd6e-125bd956a39a-kube-api-access-q62t4\") pod \"seaweedfs-86cc847c5c-cr99v\" (UID: \"ef92bce9-74ab-49c5-bd6e-125bd956a39a\") " pod="kserve/seaweedfs-86cc847c5c-cr99v" Apr 22 20:04:26.756759 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.756621 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41c59d17-f97b-4bd8-b285-84945993f060-cert\") pod \"kserve-controller-manager-665c47d676-j9ccm\" (UID: \"41c59d17-f97b-4bd8-b285-84945993f060\") " pod="kserve/kserve-controller-manager-665c47d676-j9ccm" Apr 22 20:04:26.756759 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.756656 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ef92bce9-74ab-49c5-bd6e-125bd956a39a-data\") pod \"seaweedfs-86cc847c5c-cr99v\" (UID: \"ef92bce9-74ab-49c5-bd6e-125bd956a39a\") " pod="kserve/seaweedfs-86cc847c5c-cr99v" Apr 22 20:04:26.756759 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.756672 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5mw5\" (UniqueName: \"kubernetes.io/projected/41c59d17-f97b-4bd8-b285-84945993f060-kube-api-access-w5mw5\") pod \"kserve-controller-manager-665c47d676-j9ccm\" (UID: \"41c59d17-f97b-4bd8-b285-84945993f060\") " pod="kserve/kserve-controller-manager-665c47d676-j9ccm" Apr 22 20:04:26.757134 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.757112 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ef92bce9-74ab-49c5-bd6e-125bd956a39a-data\") pod \"seaweedfs-86cc847c5c-cr99v\" (UID: \"ef92bce9-74ab-49c5-bd6e-125bd956a39a\") " pod="kserve/seaweedfs-86cc847c5c-cr99v" Apr 22 20:04:26.759193 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.759175 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41c59d17-f97b-4bd8-b285-84945993f060-cert\") pod \"kserve-controller-manager-665c47d676-j9ccm\" (UID: \"41c59d17-f97b-4bd8-b285-84945993f060\") " pod="kserve/kserve-controller-manager-665c47d676-j9ccm" Apr 22 20:04:26.765761 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.765704 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q62t4\" (UniqueName: \"kubernetes.io/projected/ef92bce9-74ab-49c5-bd6e-125bd956a39a-kube-api-access-q62t4\") pod \"seaweedfs-86cc847c5c-cr99v\" (UID: \"ef92bce9-74ab-49c5-bd6e-125bd956a39a\") " pod="kserve/seaweedfs-86cc847c5c-cr99v" Apr 22 20:04:26.766183 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.766162 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5mw5\" (UniqueName: \"kubernetes.io/projected/41c59d17-f97b-4bd8-b285-84945993f060-kube-api-access-w5mw5\") pod \"kserve-controller-manager-665c47d676-j9ccm\" (UID: \"41c59d17-f97b-4bd8-b285-84945993f060\") " pod="kserve/kserve-controller-manager-665c47d676-j9ccm" Apr 22 20:04:26.821991 ip-10-0-135-221 
kubenswrapper[2583]: I0422 20:04:26.821963 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-j9ccm" Apr 22 20:04:26.872204 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.872090 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-cr99v" Apr 22 20:04:26.952972 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.952936 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-j9ccm"] Apr 22 20:04:26.954953 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:04:26.954925 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41c59d17_f97b_4bd8_b285_84945993f060.slice/crio-5e4dddb7ac16a33db3c22b4b62b49758672c31da0ccbf128cfe03a3b039bad68 WatchSource:0}: Error finding container 5e4dddb7ac16a33db3c22b4b62b49758672c31da0ccbf128cfe03a3b039bad68: Status 404 returned error can't find the container with id 5e4dddb7ac16a33db3c22b4b62b49758672c31da0ccbf128cfe03a3b039bad68 Apr 22 20:04:26.956186 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:26.956169 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:04:27.003799 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:27.003777 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-cr99v"] Apr 22 20:04:27.006368 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:04:27.006340 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef92bce9_74ab_49c5_bd6e_125bd956a39a.slice/crio-a54446fba62b2b9cc6c89504cff11534b02774f4c99a87a1d9c728e614695a52 WatchSource:0}: Error finding container a54446fba62b2b9cc6c89504cff11534b02774f4c99a87a1d9c728e614695a52: Status 404 returned error can't find the container with id a54446fba62b2b9cc6c89504cff11534b02774f4c99a87a1d9c728e614695a52 Apr 22 20:04:27.334570 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:27.334537 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-cr99v" event={"ID":"ef92bce9-74ab-49c5-bd6e-125bd956a39a","Type":"ContainerStarted","Data":"a54446fba62b2b9cc6c89504cff11534b02774f4c99a87a1d9c728e614695a52"} Apr 22 20:04:27.335582 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:27.335558 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-j9ccm" event={"ID":"41c59d17-f97b-4bd8-b285-84945993f060","Type":"ContainerStarted","Data":"5e4dddb7ac16a33db3c22b4b62b49758672c31da0ccbf128cfe03a3b039bad68"} Apr 22 20:04:31.352700 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:31.352660 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-cr99v" event={"ID":"ef92bce9-74ab-49c5-bd6e-125bd956a39a","Type":"ContainerStarted","Data":"cdaaca76df10a8d5efe80bf07e2103bade34ea25f4518595bba355a94e3883bf"} Apr 22 20:04:31.353153 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:31.352776 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-cr99v" Apr 22 20:04:31.359710 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:31.359681 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-j9ccm" 
event={"ID":"41c59d17-f97b-4bd8-b285-84945993f060","Type":"ContainerStarted","Data":"2928329cc8328ebe24b9d4219a0c5b6469e374e63f831c910bf3b44437e387d8"} Apr 22 20:04:31.359899 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:31.359814 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-665c47d676-j9ccm" Apr 22 20:04:31.370088 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:31.370046 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-cr99v" podStartSLOduration=1.798128647 podStartE2EDuration="5.370034731s" podCreationTimestamp="2026-04-22 20:04:26 +0000 UTC" firstStartedPulling="2026-04-22 20:04:27.007676651 +0000 UTC m=+355.427583421" lastFinishedPulling="2026-04-22 20:04:30.579582731 +0000 UTC m=+358.999489505" observedRunningTime="2026-04-22 20:04:31.368571731 +0000 UTC m=+359.788478523" watchObservedRunningTime="2026-04-22 20:04:31.370034731 +0000 UTC m=+359.789941522" Apr 22 20:04:31.385352 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:31.385306 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-665c47d676-j9ccm" podStartSLOduration=1.864888339 podStartE2EDuration="5.385293115s" podCreationTimestamp="2026-04-22 20:04:26 +0000 UTC" firstStartedPulling="2026-04-22 20:04:26.956289281 +0000 UTC m=+355.376196050" lastFinishedPulling="2026-04-22 20:04:30.476694055 +0000 UTC m=+358.896600826" observedRunningTime="2026-04-22 20:04:31.384510323 +0000 UTC m=+359.804417126" watchObservedRunningTime="2026-04-22 20:04:31.385293115 +0000 UTC m=+359.805199906" Apr 22 20:04:37.365047 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:04:37.365016 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-cr99v" Apr 22 20:05:02.062350 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.060733 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-j9ccm"] Apr 22 20:05:02.062350 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.061132 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-665c47d676-j9ccm" podUID="41c59d17-f97b-4bd8-b285-84945993f060" containerName="manager" containerID="cri-o://2928329cc8328ebe24b9d4219a0c5b6469e374e63f831c910bf3b44437e387d8" gracePeriod=10 Apr 22 20:05:02.070587 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.070562 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-665c47d676-j9ccm" Apr 22 20:05:02.081919 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.081891 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-665c47d676-mqq69"] Apr 22 20:05:02.084513 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.084496 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-mqq69" Apr 22 20:05:02.090112 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.090090 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-mqq69"] Apr 22 20:05:02.144241 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.144216 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zgfm\" (UniqueName: \"kubernetes.io/projected/3c2fc1d0-902a-4917-81bd-610893267e85-kube-api-access-9zgfm\") pod \"kserve-controller-manager-665c47d676-mqq69\" (UID: \"3c2fc1d0-902a-4917-81bd-610893267e85\") " pod="kserve/kserve-controller-manager-665c47d676-mqq69" Apr 22 20:05:02.144392 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.144360 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c2fc1d0-902a-4917-81bd-610893267e85-cert\") pod \"kserve-controller-manager-665c47d676-mqq69\" (UID: \"3c2fc1d0-902a-4917-81bd-610893267e85\") " pod="kserve/kserve-controller-manager-665c47d676-mqq69" Apr 22 20:05:02.245630 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.245601 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c2fc1d0-902a-4917-81bd-610893267e85-cert\") pod \"kserve-controller-manager-665c47d676-mqq69\" (UID: \"3c2fc1d0-902a-4917-81bd-610893267e85\") " pod="kserve/kserve-controller-manager-665c47d676-mqq69" Apr 22 20:05:02.245776 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.245665 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zgfm\" (UniqueName: \"kubernetes.io/projected/3c2fc1d0-902a-4917-81bd-610893267e85-kube-api-access-9zgfm\") pod \"kserve-controller-manager-665c47d676-mqq69\" (UID: \"3c2fc1d0-902a-4917-81bd-610893267e85\") " pod="kserve/kserve-controller-manager-665c47d676-mqq69" Apr 22 20:05:02.248219 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.248193 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c2fc1d0-902a-4917-81bd-610893267e85-cert\") pod \"kserve-controller-manager-665c47d676-mqq69\" (UID: \"3c2fc1d0-902a-4917-81bd-610893267e85\") " pod="kserve/kserve-controller-manager-665c47d676-mqq69" Apr 22 20:05:02.255622 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.255594 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zgfm\" (UniqueName: \"kubernetes.io/projected/3c2fc1d0-902a-4917-81bd-610893267e85-kube-api-access-9zgfm\") pod \"kserve-controller-manager-665c47d676-mqq69\" (UID: \"3c2fc1d0-902a-4917-81bd-610893267e85\") " pod="kserve/kserve-controller-manager-665c47d676-mqq69" Apr 22 20:05:02.308165 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.308145 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-j9ccm" Apr 22 20:05:02.346383 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.346302 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41c59d17-f97b-4bd8-b285-84945993f060-cert\") pod \"41c59d17-f97b-4bd8-b285-84945993f060\" (UID: \"41c59d17-f97b-4bd8-b285-84945993f060\") " Apr 22 20:05:02.346383 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.346349 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5mw5\" (UniqueName: \"kubernetes.io/projected/41c59d17-f97b-4bd8-b285-84945993f060-kube-api-access-w5mw5\") pod \"41c59d17-f97b-4bd8-b285-84945993f060\" (UID: \"41c59d17-f97b-4bd8-b285-84945993f060\") " Apr 22 20:05:02.348593 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.348562 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c59d17-f97b-4bd8-b285-84945993f060-kube-api-access-w5mw5" (OuterVolumeSpecName: "kube-api-access-w5mw5") pod "41c59d17-f97b-4bd8-b285-84945993f060" (UID: "41c59d17-f97b-4bd8-b285-84945993f060"). InnerVolumeSpecName "kube-api-access-w5mw5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:05:02.348593 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.348579 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41c59d17-f97b-4bd8-b285-84945993f060-cert" (OuterVolumeSpecName: "cert") pod "41c59d17-f97b-4bd8-b285-84945993f060" (UID: "41c59d17-f97b-4bd8-b285-84945993f060"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:05:02.441396 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.441357 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-mqq69" Apr 22 20:05:02.447392 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.447372 2583 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/41c59d17-f97b-4bd8-b285-84945993f060-cert\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:05:02.447451 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.447396 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w5mw5\" (UniqueName: \"kubernetes.io/projected/41c59d17-f97b-4bd8-b285-84945993f060-kube-api-access-w5mw5\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:05:02.465051 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.465020 2583 generic.go:358] "Generic (PLEG): container finished" podID="41c59d17-f97b-4bd8-b285-84945993f060" containerID="2928329cc8328ebe24b9d4219a0c5b6469e374e63f831c910bf3b44437e387d8" exitCode=0 Apr 22 20:05:02.465196 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.465104 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-j9ccm" Apr 22 20:05:02.465196 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.465109 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-j9ccm" event={"ID":"41c59d17-f97b-4bd8-b285-84945993f060","Type":"ContainerDied","Data":"2928329cc8328ebe24b9d4219a0c5b6469e374e63f831c910bf3b44437e387d8"} Apr 22 20:05:02.465196 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.465152 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-j9ccm" event={"ID":"41c59d17-f97b-4bd8-b285-84945993f060","Type":"ContainerDied","Data":"5e4dddb7ac16a33db3c22b4b62b49758672c31da0ccbf128cfe03a3b039bad68"} Apr 22 20:05:02.465196 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.465169 2583 scope.go:117] "RemoveContainer" containerID="2928329cc8328ebe24b9d4219a0c5b6469e374e63f831c910bf3b44437e387d8" Apr 22 20:05:02.473021 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.473005 2583 scope.go:117] "RemoveContainer" containerID="2928329cc8328ebe24b9d4219a0c5b6469e374e63f831c910bf3b44437e387d8" Apr 22 20:05:02.473278 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:05:02.473261 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2928329cc8328ebe24b9d4219a0c5b6469e374e63f831c910bf3b44437e387d8\": container with ID starting with 2928329cc8328ebe24b9d4219a0c5b6469e374e63f831c910bf3b44437e387d8 not found: ID does not exist" containerID="2928329cc8328ebe24b9d4219a0c5b6469e374e63f831c910bf3b44437e387d8" Apr 22 20:05:02.473357 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.473292 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2928329cc8328ebe24b9d4219a0c5b6469e374e63f831c910bf3b44437e387d8"} err="failed to get container status \"2928329cc8328ebe24b9d4219a0c5b6469e374e63f831c910bf3b44437e387d8\": rpc error: code = NotFound desc = could not find container \"2928329cc8328ebe24b9d4219a0c5b6469e374e63f831c910bf3b44437e387d8\": container with ID starting with 2928329cc8328ebe24b9d4219a0c5b6469e374e63f831c910bf3b44437e387d8 not found: ID does not exist" Apr 22 20:05:02.492400 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.492369 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-j9ccm"] Apr 22 20:05:02.497169 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.497137 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-j9ccm"] Apr 22 20:05:02.567500 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:02.567479 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-mqq69"] Apr 22 20:05:02.570205 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:05:02.570123 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c2fc1d0_902a_4917_81bd_610893267e85.slice/crio-20e53ed74765e999d3628f3c8cff90c5ee873235cb93ed54cb88437fe1116c54 WatchSource:0}: Error finding container 20e53ed74765e999d3628f3c8cff90c5ee873235cb93ed54cb88437fe1116c54: Status 404 returned error can't find the container with id 20e53ed74765e999d3628f3c8cff90c5ee873235cb93ed54cb88437fe1116c54 Apr 22 20:05:03.470369 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:03.470335 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/kserve-controller-manager-665c47d676-mqq69" event={"ID":"3c2fc1d0-902a-4917-81bd-610893267e85","Type":"ContainerStarted","Data":"c50bef2e8a62af2b254e480d6937d195901460196f79366c2dabae50295a9392"} Apr 22 20:05:03.470369 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:03.470373 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-mqq69" event={"ID":"3c2fc1d0-902a-4917-81bd-610893267e85","Type":"ContainerStarted","Data":"20e53ed74765e999d3628f3c8cff90c5ee873235cb93ed54cb88437fe1116c54"} Apr 22 20:05:03.470955 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:03.470388 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-665c47d676-mqq69" Apr 22 20:05:03.489983 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:03.489941 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-665c47d676-mqq69" podStartSLOduration=1.173319698 podStartE2EDuration="1.489926105s" podCreationTimestamp="2026-04-22 20:05:02 +0000 UTC" firstStartedPulling="2026-04-22 20:05:02.573212681 +0000 UTC m=+390.993119450" lastFinishedPulling="2026-04-22 20:05:02.889819084 +0000 UTC m=+391.309725857" observedRunningTime="2026-04-22 20:05:03.488754239 +0000 UTC m=+391.908661031" watchObservedRunningTime="2026-04-22 20:05:03.489926105 +0000 UTC m=+391.909832895" Apr 22 20:05:04.096729 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:04.096700 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c59d17-f97b-4bd8-b285-84945993f060" path="/var/lib/kubelet/pods/41c59d17-f97b-4bd8-b285-84945993f060/volumes" Apr 22 20:05:34.479652 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:34.479619 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-665c47d676-mqq69" Apr 22 20:05:35.303076 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.303046 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-7x4d2"] Apr 22 20:05:35.303411 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.303398 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41c59d17-f97b-4bd8-b285-84945993f060" containerName="manager" Apr 22 20:05:35.303458 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.303413 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c59d17-f97b-4bd8-b285-84945993f060" containerName="manager" Apr 22 20:05:35.303529 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.303476 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="41c59d17-f97b-4bd8-b285-84945993f060" containerName="manager" Apr 22 20:05:35.306704 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.306686 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-7x4d2" Apr 22 20:05:35.308713 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.308693 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 20:05:35.308816 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.308799 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-tqt24\"" Apr 22 20:05:35.315732 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.315709 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-7x4d2"] Apr 22 20:05:35.318258 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.318238 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-tllxn"] Apr 22 20:05:35.321438 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.321421 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-tllxn" Apr 22 20:05:35.323489 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.323472 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 20:05:35.323615 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.323594 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-2df8z\"" Apr 22 20:05:35.330921 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.330897 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-tllxn"] Apr 22 20:05:35.429823 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.429790 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m87zr\" (UniqueName: \"kubernetes.io/projected/7d6641be-ac29-4946-b17c-de887e423d4e-kube-api-access-m87zr\") pod \"odh-model-controller-696fc77849-tllxn\" (UID: \"7d6641be-ac29-4946-b17c-de887e423d4e\") " pod="kserve/odh-model-controller-696fc77849-tllxn" Apr 22 20:05:35.430014 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.429847 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d6641be-ac29-4946-b17c-de887e423d4e-cert\") pod \"odh-model-controller-696fc77849-tllxn\" (UID: \"7d6641be-ac29-4946-b17c-de887e423d4e\") " pod="kserve/odh-model-controller-696fc77849-tllxn" Apr 22 20:05:35.430014 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.429940 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9gpr\" (UniqueName: \"kubernetes.io/projected/2b7b19da-cb88-46a4-8640-dc8caa8bae0a-kube-api-access-g9gpr\") pod \"model-serving-api-86f7b4b499-7x4d2\" (UID: \"2b7b19da-cb88-46a4-8640-dc8caa8bae0a\") " pod="kserve/model-serving-api-86f7b4b499-7x4d2" Apr 22 20:05:35.430014 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.430001 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b7b19da-cb88-46a4-8640-dc8caa8bae0a-tls-certs\") pod \"model-serving-api-86f7b4b499-7x4d2\" (UID: \"2b7b19da-cb88-46a4-8640-dc8caa8bae0a\") " pod="kserve/model-serving-api-86f7b4b499-7x4d2" Apr 22 20:05:35.530926 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.530887 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d6641be-ac29-4946-b17c-de887e423d4e-cert\") pod \"odh-model-controller-696fc77849-tllxn\" (UID: \"7d6641be-ac29-4946-b17c-de887e423d4e\") " pod="kserve/odh-model-controller-696fc77849-tllxn" Apr 22 20:05:35.531398 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.530972 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9gpr\" (UniqueName: \"kubernetes.io/projected/2b7b19da-cb88-46a4-8640-dc8caa8bae0a-kube-api-access-g9gpr\") pod \"model-serving-api-86f7b4b499-7x4d2\" (UID: \"2b7b19da-cb88-46a4-8640-dc8caa8bae0a\") " pod="kserve/model-serving-api-86f7b4b499-7x4d2" Apr 22 20:05:35.531398 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.531008 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b7b19da-cb88-46a4-8640-dc8caa8bae0a-tls-certs\") pod \"model-serving-api-86f7b4b499-7x4d2\" (UID: \"2b7b19da-cb88-46a4-8640-dc8caa8bae0a\") " pod="kserve/model-serving-api-86f7b4b499-7x4d2" Apr 22 20:05:35.531398 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.531081 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m87zr\" (UniqueName: \"kubernetes.io/projected/7d6641be-ac29-4946-b17c-de887e423d4e-kube-api-access-m87zr\") pod \"odh-model-controller-696fc77849-tllxn\" (UID: \"7d6641be-ac29-4946-b17c-de887e423d4e\") " pod="kserve/odh-model-controller-696fc77849-tllxn" Apr 22 20:05:35.531398 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:05:35.531143 2583 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 22 20:05:35.531398 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:05:35.531211 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b7b19da-cb88-46a4-8640-dc8caa8bae0a-tls-certs podName:2b7b19da-cb88-46a4-8640-dc8caa8bae0a nodeName:}" failed. No retries permitted until 2026-04-22 20:05:36.031195366 +0000 UTC m=+424.451102135 (durationBeforeRetry 500ms). 
Apr 22 20:05:35.533487 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.533460 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d6641be-ac29-4946-b17c-de887e423d4e-cert\") pod \"odh-model-controller-696fc77849-tllxn\" (UID: \"7d6641be-ac29-4946-b17c-de887e423d4e\") " pod="kserve/odh-model-controller-696fc77849-tllxn"
Apr 22 20:05:35.541710 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.541686 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m87zr\" (UniqueName: \"kubernetes.io/projected/7d6641be-ac29-4946-b17c-de887e423d4e-kube-api-access-m87zr\") pod \"odh-model-controller-696fc77849-tllxn\" (UID: \"7d6641be-ac29-4946-b17c-de887e423d4e\") " pod="kserve/odh-model-controller-696fc77849-tllxn"
Apr 22 20:05:35.542190 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.542171 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9gpr\" (UniqueName: \"kubernetes.io/projected/2b7b19da-cb88-46a4-8640-dc8caa8bae0a-kube-api-access-g9gpr\") pod \"model-serving-api-86f7b4b499-7x4d2\" (UID: \"2b7b19da-cb88-46a4-8640-dc8caa8bae0a\") " pod="kserve/model-serving-api-86f7b4b499-7x4d2"
Apr 22 20:05:35.632700 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.632626 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-tllxn"
Apr 22 20:05:35.754107 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:35.754087 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-tllxn"]
Apr 22 20:05:35.756171 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:05:35.756142 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d6641be_ac29_4946_b17c_de887e423d4e.slice/crio-cce8a21de29c1293308943125483ba024ddad26af099003b2655e8abc6ec1df0 WatchSource:0}: Error finding container cce8a21de29c1293308943125483ba024ddad26af099003b2655e8abc6ec1df0: Status 404 returned error can't find the container with id cce8a21de29c1293308943125483ba024ddad26af099003b2655e8abc6ec1df0
Apr 22 20:05:36.035924 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:36.035855 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b7b19da-cb88-46a4-8640-dc8caa8bae0a-tls-certs\") pod \"model-serving-api-86f7b4b499-7x4d2\" (UID: \"2b7b19da-cb88-46a4-8640-dc8caa8bae0a\") " pod="kserve/model-serving-api-86f7b4b499-7x4d2"
Apr 22 20:05:36.038516 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:36.038492 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b7b19da-cb88-46a4-8640-dc8caa8bae0a-tls-certs\") pod \"model-serving-api-86f7b4b499-7x4d2\" (UID: \"2b7b19da-cb88-46a4-8640-dc8caa8bae0a\") " pod="kserve/model-serving-api-86f7b4b499-7x4d2"
Apr 22 20:05:36.217814 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:36.217775 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-7x4d2"
Apr 22 20:05:36.344589 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:36.344564 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-7x4d2"]
Apr 22 20:05:36.347310 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:05:36.347268 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b7b19da_cb88_46a4_8640_dc8caa8bae0a.slice/crio-c173c5faf476d9f5597eccbced005713eea1ed153410d12c237a070ed1233935 WatchSource:0}: Error finding container c173c5faf476d9f5597eccbced005713eea1ed153410d12c237a070ed1233935: Status 404 returned error can't find the container with id c173c5faf476d9f5597eccbced005713eea1ed153410d12c237a070ed1233935
Apr 22 20:05:36.587499 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:36.587404 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-7x4d2" event={"ID":"2b7b19da-cb88-46a4-8640-dc8caa8bae0a","Type":"ContainerStarted","Data":"c173c5faf476d9f5597eccbced005713eea1ed153410d12c237a070ed1233935"}
Apr 22 20:05:36.589033 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:36.588999 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-tllxn" event={"ID":"7d6641be-ac29-4946-b17c-de887e423d4e","Type":"ContainerStarted","Data":"cce8a21de29c1293308943125483ba024ddad26af099003b2655e8abc6ec1df0"}
Apr 22 20:05:39.600149 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:39.600112 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-7x4d2" event={"ID":"2b7b19da-cb88-46a4-8640-dc8caa8bae0a","Type":"ContainerStarted","Data":"022077c6850db46373049dcd004631bdfcdbee5eb24d4f886df48fed693e7f7c"}
Apr 22 20:05:39.600620 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:39.600269 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-7x4d2"
Apr 22 20:05:39.601566 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:39.601534 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-tllxn" event={"ID":"7d6641be-ac29-4946-b17c-de887e423d4e","Type":"ContainerStarted","Data":"d8aa41b3835cce279718334d839b82399d2588c842356ef5a55d97fe4fa3a083"}
Apr 22 20:05:39.601719 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:39.601693 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-tllxn"
Apr 22 20:05:39.615805 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:39.615763 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-7x4d2" podStartSLOduration=2.180713905 podStartE2EDuration="4.615747692s" podCreationTimestamp="2026-04-22 20:05:35 +0000 UTC" firstStartedPulling="2026-04-22 20:05:36.349606291 +0000 UTC m=+424.769513064" lastFinishedPulling="2026-04-22 20:05:38.784640081 +0000 UTC m=+427.204546851" observedRunningTime="2026-04-22 20:05:39.614477243 +0000 UTC m=+428.034384061" watchObservedRunningTime="2026-04-22 20:05:39.615747692 +0000 UTC m=+428.035654484"
Apr 22 20:05:39.631102 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:39.631054 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-tllxn" podStartSLOduration=1.601098615 podStartE2EDuration="4.631040316s" podCreationTimestamp="2026-04-22 20:05:35 +0000 UTC" firstStartedPulling="2026-04-22 20:05:35.757507345 +0000 UTC m=+424.177414114" lastFinishedPulling="2026-04-22 20:05:38.787449037 +0000 UTC m=+427.207355815" observedRunningTime="2026-04-22 20:05:39.630608002 +0000 UTC m=+428.050514787" watchObservedRunningTime="2026-04-22 20:05:39.631040316 +0000 UTC m=+428.050947108"
Apr 22 20:05:50.608298 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:50.608258 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-tllxn"
Apr 22 20:05:50.610825 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:50.610800 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-7x4d2"
Apr 22 20:05:53.688454 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.688415 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f6fd456cd-mtx2p"]
Apr 22 20:05:53.691873 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.691845 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.700850 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.700826 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f6fd456cd-mtx2p"]
Apr 22 20:05:53.784242 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.784206 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpnw5\" (UniqueName: \"kubernetes.io/projected/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-kube-api-access-jpnw5\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.784242 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.784246 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-service-ca\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.784449 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.784272 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-oauth-serving-cert\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.784449 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.784378 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-trusted-ca-bundle\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.784449 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.784414 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-console-serving-cert\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.784546 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.784462 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-console-oauth-config\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.784546 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.784540 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-console-config\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.885481 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.885446 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-console-serving-cert\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.885481 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.885487 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-console-oauth-config\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.885734 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.885522 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-console-config\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.885734 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.885546 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpnw5\" (UniqueName: \"kubernetes.io/projected/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-kube-api-access-jpnw5\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.885734 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.885563 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-service-ca\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.885734 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.885577 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-oauth-serving-cert\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.885734 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.885709 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-trusted-ca-bundle\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.886494 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.886469 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-oauth-serving-cert\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.886591 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.886534 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-console-config\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.886591 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.886532 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-service-ca\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.886902 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.886850 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-trusted-ca-bundle\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.888174 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.888151 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-console-serving-cert\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.888300 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.888280 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-console-oauth-config\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:53.893074 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:53.893057 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpnw5\" (UniqueName: \"kubernetes.io/projected/6bd5a835-01f0-4b34-9121-a4ba62fa8ca5-kube-api-access-jpnw5\") pod \"console-5f6fd456cd-mtx2p\" (UID: \"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5\") " pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:54.002196 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:54.002107 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:05:54.128668 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:54.128628 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f6fd456cd-mtx2p"]
Apr 22 20:05:54.130935 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:05:54.130911 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bd5a835_01f0_4b34_9121_a4ba62fa8ca5.slice/crio-7829744a77406161af920bb9efae9273102b5399b6b57aeb7b49b4762332b29b WatchSource:0}: Error finding container 7829744a77406161af920bb9efae9273102b5399b6b57aeb7b49b4762332b29b: Status 404 returned error can't find the container with id 7829744a77406161af920bb9efae9273102b5399b6b57aeb7b49b4762332b29b
Apr 22 20:05:54.663397 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:54.663362 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6fd456cd-mtx2p" event={"ID":"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5","Type":"ContainerStarted","Data":"c047bd7a87a07295b08d7893bee7b5314b58fcb1d345408c9ea524c85eb0ade9"}
Apr 22 20:05:54.663397 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:54.663399 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6fd456cd-mtx2p" event={"ID":"6bd5a835-01f0-4b34-9121-a4ba62fa8ca5","Type":"ContainerStarted","Data":"7829744a77406161af920bb9efae9273102b5399b6b57aeb7b49b4762332b29b"}
Apr 22 20:05:54.681701 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:05:54.681647 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f6fd456cd-mtx2p" podStartSLOduration=1.6816340159999998 podStartE2EDuration="1.681634016s" podCreationTimestamp="2026-04-22 20:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:05:54.679717211 +0000 UTC m=+443.099624004" watchObservedRunningTime="2026-04-22 20:05:54.681634016 +0000 UTC m=+443.101540807"
Apr 22 20:06:02.139791 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:02.139760 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt"]
Apr 22 20:06:02.143328 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:02.143296 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt"
Apr 22 20:06:02.145971 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:02.145954 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 22 20:06:02.150214 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:02.150187 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt"]
Apr 22 20:06:02.155447 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:02.155425 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0c332dbd-0575-4581-85ed-fbb42587b65a-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-vq6wt\" (UID: \"0c332dbd-0575-4581-85ed-fbb42587b65a\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt"
Apr 22 20:06:02.155530 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:02.155475 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjc5d\" (UniqueName: \"kubernetes.io/projected/0c332dbd-0575-4581-85ed-fbb42587b65a-kube-api-access-hjc5d\") pod \"seaweedfs-tls-custom-ddd4dbfd-vq6wt\" (UID: \"0c332dbd-0575-4581-85ed-fbb42587b65a\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt"
Apr 22 20:06:02.256020 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:02.255986 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0c332dbd-0575-4581-85ed-fbb42587b65a-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-vq6wt\" (UID: \"0c332dbd-0575-4581-85ed-fbb42587b65a\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt"
Apr 22 20:06:02.256182 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:02.256041 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjc5d\" (UniqueName: \"kubernetes.io/projected/0c332dbd-0575-4581-85ed-fbb42587b65a-kube-api-access-hjc5d\") pod \"seaweedfs-tls-custom-ddd4dbfd-vq6wt\" (UID: \"0c332dbd-0575-4581-85ed-fbb42587b65a\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt"
Apr 22 20:06:02.256359 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:02.256343 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0c332dbd-0575-4581-85ed-fbb42587b65a-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-vq6wt\" (UID: \"0c332dbd-0575-4581-85ed-fbb42587b65a\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt"
Apr 22 20:06:02.265217 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:02.265188 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjc5d\" (UniqueName: \"kubernetes.io/projected/0c332dbd-0575-4581-85ed-fbb42587b65a-kube-api-access-hjc5d\") pod \"seaweedfs-tls-custom-ddd4dbfd-vq6wt\" (UID: \"0c332dbd-0575-4581-85ed-fbb42587b65a\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt"
Apr 22 20:06:02.453920 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:02.453886 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt"
Apr 22 20:06:02.597319 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:02.597296 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt"]
Apr 22 20:06:02.599377 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:06:02.599348 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c332dbd_0575_4581_85ed_fbb42587b65a.slice/crio-ea50336f3865b7a0594306bd6a7c0c137704def991e84aae325474d95053248e WatchSource:0}: Error finding container ea50336f3865b7a0594306bd6a7c0c137704def991e84aae325474d95053248e: Status 404 returned error can't find the container with id ea50336f3865b7a0594306bd6a7c0c137704def991e84aae325474d95053248e
Apr 22 20:06:02.690383 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:02.690347 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt" event={"ID":"0c332dbd-0575-4581-85ed-fbb42587b65a","Type":"ContainerStarted","Data":"ea50336f3865b7a0594306bd6a7c0c137704def991e84aae325474d95053248e"}
Apr 22 20:06:03.700282 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:03.700247 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt" event={"ID":"0c332dbd-0575-4581-85ed-fbb42587b65a","Type":"ContainerStarted","Data":"5c5f73850c0381188b8b8cd892bdb722b44a45e7e0dcaba83ca024d53e723641"}
Apr 22 20:06:03.718695 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:03.718634 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt" podStartSLOduration=1.451446321 podStartE2EDuration="1.718618834s" podCreationTimestamp="2026-04-22 20:06:02 +0000 UTC" firstStartedPulling="2026-04-22 20:06:02.600988363 +0000 UTC m=+451.020895132" lastFinishedPulling="2026-04-22 20:06:02.868160874 +0000 UTC m=+451.288067645" observedRunningTime="2026-04-22 20:06:03.71761554 +0000 UTC m=+452.137522342" watchObservedRunningTime="2026-04-22 20:06:03.718618834 +0000 UTC m=+452.138525628"
Apr 22 20:06:04.003048 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:04.002948 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:06:04.003048 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:04.002987 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:06:04.007604 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:04.007582 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:06:04.669949 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:04.669914 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt"]
Apr 22 20:06:04.707412 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:04.707381 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f6fd456cd-mtx2p"
Apr 22 20:06:04.763266 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:04.763232 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69b784fffc-znwsh"]
Apr 22 20:06:05.706706 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:05.706666 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt" podUID="0c332dbd-0575-4581-85ed-fbb42587b65a" containerName="seaweedfs-tls-custom" containerID="cri-o://5c5f73850c0381188b8b8cd892bdb722b44a45e7e0dcaba83ca024d53e723641" gracePeriod=30
podUID="0c332dbd-0575-4581-85ed-fbb42587b65a" containerName="seaweedfs-tls-custom" containerID="cri-o://5c5f73850c0381188b8b8cd892bdb722b44a45e7e0dcaba83ca024d53e723641" gracePeriod=30 Apr 22 20:06:06.947937 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:06.947913 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt" Apr 22 20:06:07.001683 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:07.001611 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjc5d\" (UniqueName: \"kubernetes.io/projected/0c332dbd-0575-4581-85ed-fbb42587b65a-kube-api-access-hjc5d\") pod \"0c332dbd-0575-4581-85ed-fbb42587b65a\" (UID: \"0c332dbd-0575-4581-85ed-fbb42587b65a\") " Apr 22 20:06:07.001683 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:07.001663 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0c332dbd-0575-4581-85ed-fbb42587b65a-data\") pod \"0c332dbd-0575-4581-85ed-fbb42587b65a\" (UID: \"0c332dbd-0575-4581-85ed-fbb42587b65a\") " Apr 22 20:06:07.002990 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:07.002963 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c332dbd-0575-4581-85ed-fbb42587b65a-data" (OuterVolumeSpecName: "data") pod "0c332dbd-0575-4581-85ed-fbb42587b65a" (UID: "0c332dbd-0575-4581-85ed-fbb42587b65a"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:06:07.003704 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:07.003676 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c332dbd-0575-4581-85ed-fbb42587b65a-kube-api-access-hjc5d" (OuterVolumeSpecName: "kube-api-access-hjc5d") pod "0c332dbd-0575-4581-85ed-fbb42587b65a" (UID: "0c332dbd-0575-4581-85ed-fbb42587b65a"). InnerVolumeSpecName "kube-api-access-hjc5d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:06:07.102957 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:07.102926 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hjc5d\" (UniqueName: \"kubernetes.io/projected/0c332dbd-0575-4581-85ed-fbb42587b65a-kube-api-access-hjc5d\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:06:07.102957 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:07.102950 2583 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0c332dbd-0575-4581-85ed-fbb42587b65a-data\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:06:07.715767 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:07.715729 2583 generic.go:358] "Generic (PLEG): container finished" podID="0c332dbd-0575-4581-85ed-fbb42587b65a" containerID="5c5f73850c0381188b8b8cd892bdb722b44a45e7e0dcaba83ca024d53e723641" exitCode=0 Apr 22 20:06:07.715967 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:07.715793 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt" Apr 22 20:06:07.715967 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:07.715804 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt" event={"ID":"0c332dbd-0575-4581-85ed-fbb42587b65a","Type":"ContainerDied","Data":"5c5f73850c0381188b8b8cd892bdb722b44a45e7e0dcaba83ca024d53e723641"} Apr 22 20:06:07.715967 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:07.715839 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt" event={"ID":"0c332dbd-0575-4581-85ed-fbb42587b65a","Type":"ContainerDied","Data":"ea50336f3865b7a0594306bd6a7c0c137704def991e84aae325474d95053248e"} Apr 22 20:06:07.715967 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:07.715876 2583 scope.go:117] "RemoveContainer" containerID="5c5f73850c0381188b8b8cd892bdb722b44a45e7e0dcaba83ca024d53e723641" Apr 22 20:06:07.725125 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:07.725108 2583 scope.go:117] "RemoveContainer" containerID="5c5f73850c0381188b8b8cd892bdb722b44a45e7e0dcaba83ca024d53e723641" Apr 22 20:06:07.725373 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:06:07.725357 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c5f73850c0381188b8b8cd892bdb722b44a45e7e0dcaba83ca024d53e723641\": container with ID starting with 5c5f73850c0381188b8b8cd892bdb722b44a45e7e0dcaba83ca024d53e723641 not found: ID does not exist" containerID="5c5f73850c0381188b8b8cd892bdb722b44a45e7e0dcaba83ca024d53e723641" Apr 22 20:06:07.725419 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:07.725380 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c5f73850c0381188b8b8cd892bdb722b44a45e7e0dcaba83ca024d53e723641"} err="failed to get container status \"5c5f73850c0381188b8b8cd892bdb722b44a45e7e0dcaba83ca024d53e723641\": rpc error: code = NotFound desc = could not find container \"5c5f73850c0381188b8b8cd892bdb722b44a45e7e0dcaba83ca024d53e723641\": container with ID starting with 5c5f73850c0381188b8b8cd892bdb722b44a45e7e0dcaba83ca024d53e723641 not found: ID does not exist" Apr 22 20:06:07.736090 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:07.736065 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt"] Apr 22 20:06:07.739745 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:07.739726 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vq6wt"] Apr 22 20:06:08.096896 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:08.096799 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c332dbd-0575-4581-85ed-fbb42587b65a" path="/var/lib/kubelet/pods/0c332dbd-0575-4581-85ed-fbb42587b65a/volumes" Apr 22 20:06:19.972572 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:19.972535 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-kv99x"] Apr 22 20:06:19.973165 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:19.973145 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c332dbd-0575-4581-85ed-fbb42587b65a" containerName="seaweedfs-tls-custom" Apr 22 20:06:19.973249 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:19.973170 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c332dbd-0575-4581-85ed-fbb42587b65a" containerName="seaweedfs-tls-custom" Apr 22 20:06:19.973300 
ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:19.973258 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c332dbd-0575-4581-85ed-fbb42587b65a" containerName="seaweedfs-tls-custom" Apr 22 20:06:19.976078 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:19.976056 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-kv99x" Apr 22 20:06:19.978211 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:19.978188 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 22 20:06:19.978329 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:19.978191 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 22 20:06:19.983260 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:19.983230 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-kv99x"] Apr 22 20:06:20.120042 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:20.120007 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ef1571df-a071-44df-b7f2-82276f69ec1d-data\") pod \"seaweedfs-tls-serving-7fd5766db9-kv99x\" (UID: \"ef1571df-a071-44df-b7f2-82276f69ec1d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kv99x" Apr 22 20:06:20.120226 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:20.120062 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc44g\" (UniqueName: \"kubernetes.io/projected/ef1571df-a071-44df-b7f2-82276f69ec1d-kube-api-access-jc44g\") pod \"seaweedfs-tls-serving-7fd5766db9-kv99x\" (UID: \"ef1571df-a071-44df-b7f2-82276f69ec1d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kv99x" Apr 22 20:06:20.120226 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:20.120090 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/ef1571df-a071-44df-b7f2-82276f69ec1d-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-kv99x\" (UID: \"ef1571df-a071-44df-b7f2-82276f69ec1d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kv99x" Apr 22 20:06:20.221263 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:20.221230 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ef1571df-a071-44df-b7f2-82276f69ec1d-data\") pod \"seaweedfs-tls-serving-7fd5766db9-kv99x\" (UID: \"ef1571df-a071-44df-b7f2-82276f69ec1d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kv99x" Apr 22 20:06:20.221437 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:20.221288 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jc44g\" (UniqueName: \"kubernetes.io/projected/ef1571df-a071-44df-b7f2-82276f69ec1d-kube-api-access-jc44g\") pod \"seaweedfs-tls-serving-7fd5766db9-kv99x\" (UID: \"ef1571df-a071-44df-b7f2-82276f69ec1d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kv99x" Apr 22 20:06:20.221437 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:20.221328 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/ef1571df-a071-44df-b7f2-82276f69ec1d-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-kv99x\" (UID: 
\"ef1571df-a071-44df-b7f2-82276f69ec1d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kv99x" Apr 22 20:06:20.221535 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:06:20.221434 2583 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 22 20:06:20.221535 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:06:20.221450 2583 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-kv99x: secret "seaweedfs-tls-serving" not found Apr 22 20:06:20.221613 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:06:20.221536 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ef1571df-a071-44df-b7f2-82276f69ec1d-seaweedfs-tls-serving podName:ef1571df-a071-44df-b7f2-82276f69ec1d nodeName:}" failed. No retries permitted until 2026-04-22 20:06:20.721508563 +0000 UTC m=+469.141415344 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/ef1571df-a071-44df-b7f2-82276f69ec1d-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-kv99x" (UID: "ef1571df-a071-44df-b7f2-82276f69ec1d") : secret "seaweedfs-tls-serving" not found Apr 22 20:06:20.221676 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:20.221657 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ef1571df-a071-44df-b7f2-82276f69ec1d-data\") pod \"seaweedfs-tls-serving-7fd5766db9-kv99x\" (UID: \"ef1571df-a071-44df-b7f2-82276f69ec1d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kv99x" Apr 22 20:06:20.230032 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:20.229975 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc44g\" (UniqueName: \"kubernetes.io/projected/ef1571df-a071-44df-b7f2-82276f69ec1d-kube-api-access-jc44g\") pod \"seaweedfs-tls-serving-7fd5766db9-kv99x\" (UID: \"ef1571df-a071-44df-b7f2-82276f69ec1d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kv99x" Apr 22 20:06:20.726496 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:20.726462 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/ef1571df-a071-44df-b7f2-82276f69ec1d-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-kv99x\" (UID: \"ef1571df-a071-44df-b7f2-82276f69ec1d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kv99x" Apr 22 20:06:20.729021 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:20.728999 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/ef1571df-a071-44df-b7f2-82276f69ec1d-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-kv99x\" (UID: \"ef1571df-a071-44df-b7f2-82276f69ec1d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-kv99x" Apr 22 20:06:20.886544 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:20.886508 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-kv99x" Apr 22 20:06:21.009486 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:21.009462 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-kv99x"] Apr 22 20:06:21.011534 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:06:21.011505 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef1571df_a071_44df_b7f2_82276f69ec1d.slice/crio-3c49948de534ab139dbb15c7710e8e21f6341241092f88aa8426e16686fe5785 WatchSource:0}: Error finding container 3c49948de534ab139dbb15c7710e8e21f6341241092f88aa8426e16686fe5785: Status 404 returned error can't find the container with id 3c49948de534ab139dbb15c7710e8e21f6341241092f88aa8426e16686fe5785 Apr 22 20:06:21.770099 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:21.770059 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-kv99x" event={"ID":"ef1571df-a071-44df-b7f2-82276f69ec1d","Type":"ContainerStarted","Data":"0d43eefc10f9458473f776abfa7c387c404a81bed604432435944a24e61eb1ff"} Apr 22 20:06:21.770099 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:21.770098 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-kv99x" event={"ID":"ef1571df-a071-44df-b7f2-82276f69ec1d","Type":"ContainerStarted","Data":"3c49948de534ab139dbb15c7710e8e21f6341241092f88aa8426e16686fe5785"} Apr 22 20:06:21.785501 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:21.785451 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-kv99x" podStartSLOduration=2.496824834 podStartE2EDuration="2.78543937s" podCreationTimestamp="2026-04-22 20:06:19 +0000 UTC" firstStartedPulling="2026-04-22 20:06:21.012850094 +0000 UTC m=+469.432756869" lastFinishedPulling="2026-04-22 20:06:21.301464636 +0000 UTC m=+469.721371405" observedRunningTime="2026-04-22 20:06:21.783204007 +0000 UTC m=+470.203110799" watchObservedRunningTime="2026-04-22 20:06:21.78543937 +0000 UTC m=+470.205346204" Apr 22 20:06:29.785105 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:29.785044 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-69b784fffc-znwsh" podUID="fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279" containerName="console" containerID="cri-o://07bed19084fc84d5265272adc872dd07ee01fc72357ef16d5f16ff2fc42f337c" gracePeriod=15 Apr 22 20:06:29.804720 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:29.804693 2583 patch_prober.go:28] interesting pod/console-69b784fffc-znwsh container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.24:8443/health\": dial tcp 10.134.0.24:8443: connect: connection refused" start-of-body= Apr 22 20:06:29.804821 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:29.804739 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-69b784fffc-znwsh" podUID="fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279" containerName="console" probeResult="failure" output="Get \"https://10.134.0.24:8443/health\": dial tcp 10.134.0.24:8443: connect: connection refused" Apr 22 20:06:30.032252 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.032227 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69b784fffc-znwsh_fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279/console/0.log" Apr 22 20:06:30.032379 ip-10-0-135-221 
kubenswrapper[2583]: I0422 20:06:30.032288 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:06:30.115259 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.115173 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-serving-cert\") pod \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " Apr 22 20:06:30.115259 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.115221 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-service-ca\") pod \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " Apr 22 20:06:30.115259 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.115251 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-trusted-ca-bundle\") pod \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " Apr 22 20:06:30.115600 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.115281 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-oauth-config\") pod \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " Apr 22 20:06:30.115600 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.115328 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-oauth-serving-cert\") pod \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " Apr 22 20:06:30.115600 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.115359 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t97jb\" (UniqueName: \"kubernetes.io/projected/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-kube-api-access-t97jb\") pod \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " Apr 22 20:06:30.115600 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.115442 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-config\") pod \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\" (UID: \"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279\") " Apr 22 20:06:30.115804 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.115633 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-service-ca" (OuterVolumeSpecName: "service-ca") pod "fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279" (UID: "fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:06:30.115804 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.115722 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-service-ca\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:06:30.115804 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.115729 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279" (UID: "fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:06:30.115973 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.115948 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-config" (OuterVolumeSpecName: "console-config") pod "fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279" (UID: "fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:06:30.116029 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.115977 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279" (UID: "fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:06:30.117713 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.117689 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279" (UID: "fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:06:30.117845 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.117822 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279" (UID: "fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:06:30.118198 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.118168 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-kube-api-access-t97jb" (OuterVolumeSpecName: "kube-api-access-t97jb") pod "fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279" (UID: "fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279"). InnerVolumeSpecName "kube-api-access-t97jb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:06:30.216894 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.216744 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-oauth-serving-cert\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:06:30.216894 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.216782 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t97jb\" (UniqueName: \"kubernetes.io/projected/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-kube-api-access-t97jb\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:06:30.216894 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.216798 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-config\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:06:30.216894 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.216814 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-serving-cert\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:06:30.216894 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.216827 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-trusted-ca-bundle\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:06:30.216894 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.216843 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279-console-oauth-config\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:06:30.801622 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.801590 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69b784fffc-znwsh_fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279/console/0.log" Apr 22 20:06:30.802167 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.801636 2583 generic.go:358] "Generic (PLEG): container finished" podID="fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279" containerID="07bed19084fc84d5265272adc872dd07ee01fc72357ef16d5f16ff2fc42f337c" exitCode=2 Apr 22 20:06:30.802167 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.801696 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69b784fffc-znwsh" Apr 22 20:06:30.802167 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.801724 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b784fffc-znwsh" event={"ID":"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279","Type":"ContainerDied","Data":"07bed19084fc84d5265272adc872dd07ee01fc72357ef16d5f16ff2fc42f337c"} Apr 22 20:06:30.802167 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.801761 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b784fffc-znwsh" event={"ID":"fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279","Type":"ContainerDied","Data":"4eff45c6b361ecd2b3973c1c195a40a6e0a310faacd39c8ed6429bce5c1c44e4"} Apr 22 20:06:30.802167 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.801780 2583 scope.go:117] "RemoveContainer" containerID="07bed19084fc84d5265272adc872dd07ee01fc72357ef16d5f16ff2fc42f337c" Apr 22 20:06:30.810772 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.810752 2583 scope.go:117] "RemoveContainer" containerID="07bed19084fc84d5265272adc872dd07ee01fc72357ef16d5f16ff2fc42f337c" Apr 22 20:06:30.811063 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:06:30.811046 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07bed19084fc84d5265272adc872dd07ee01fc72357ef16d5f16ff2fc42f337c\": container with ID starting with 07bed19084fc84d5265272adc872dd07ee01fc72357ef16d5f16ff2fc42f337c not found: ID does not exist" containerID="07bed19084fc84d5265272adc872dd07ee01fc72357ef16d5f16ff2fc42f337c" Apr 22 20:06:30.811138 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.811069 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07bed19084fc84d5265272adc872dd07ee01fc72357ef16d5f16ff2fc42f337c"} err="failed to get container status \"07bed19084fc84d5265272adc872dd07ee01fc72357ef16d5f16ff2fc42f337c\": rpc error: code = NotFound desc = could not find container \"07bed19084fc84d5265272adc872dd07ee01fc72357ef16d5f16ff2fc42f337c\": container with ID starting with 07bed19084fc84d5265272adc872dd07ee01fc72357ef16d5f16ff2fc42f337c not found: ID does not exist" Apr 22 20:06:30.822934 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.822896 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69b784fffc-znwsh"] Apr 22 20:06:30.824755 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:30.824738 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-69b784fffc-znwsh"] Apr 22 20:06:32.097439 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:32.097406 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279" path="/var/lib/kubelet/pods/fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279/volumes" Apr 22 20:06:37.929778 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:37.929739 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw"] Apr 22 20:06:37.930324 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:37.930306 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279" containerName="console" Apr 22 20:06:37.930324 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:37.930326 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf8a4ae-6fb7-4b99-a716-bb0c04dc7279" containerName="console" Apr 22 20:06:37.930442 
Apr 22 20:06:37.933369 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:37.933348 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw"
Apr 22 20:06:37.935455 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:37.935435 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t9c5k\""
Apr 22 20:06:37.940357 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:37.940334 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw"]
Apr 22 20:06:38.085148 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:38.085116 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/994d1edd-7ea3-4d38-b647-0ba1b9f11f58-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw\" (UID: \"994d1edd-7ea3-4d38-b647-0ba1b9f11f58\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw"
Apr 22 20:06:38.186379 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:38.186288 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/994d1edd-7ea3-4d38-b647-0ba1b9f11f58-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw\" (UID: \"994d1edd-7ea3-4d38-b647-0ba1b9f11f58\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw"
Apr 22 20:06:38.186683 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:38.186664 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/994d1edd-7ea3-4d38-b647-0ba1b9f11f58-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw\" (UID: \"994d1edd-7ea3-4d38-b647-0ba1b9f11f58\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw"
Apr 22 20:06:38.245975 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:38.245946 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw"
Apr 22 20:06:38.373802 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:38.373778 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw"]
Apr 22 20:06:38.376484 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:06:38.376453 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod994d1edd_7ea3_4d38_b647_0ba1b9f11f58.slice/crio-51dea5d8cca2a3d7764bbe032f12526c325baf514fc79e8df591e8a3e4bad94c WatchSource:0}: Error finding container 51dea5d8cca2a3d7764bbe032f12526c325baf514fc79e8df591e8a3e4bad94c: Status 404 returned error can't find the container with id 51dea5d8cca2a3d7764bbe032f12526c325baf514fc79e8df591e8a3e4bad94c
Apr 22 20:06:38.833901 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:38.833837 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" event={"ID":"994d1edd-7ea3-4d38-b647-0ba1b9f11f58","Type":"ContainerStarted","Data":"51dea5d8cca2a3d7764bbe032f12526c325baf514fc79e8df591e8a3e4bad94c"}
Apr 22 20:06:43.855215 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:43.855181 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" event={"ID":"994d1edd-7ea3-4d38-b647-0ba1b9f11f58","Type":"ContainerStarted","Data":"862f10fc51d07019e13a9e675c89029ee57842053d7950c45a156859db3b2372"}
Apr 22 20:06:47.869350 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:47.869316 2583 generic.go:358] "Generic (PLEG): container finished" podID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerID="862f10fc51d07019e13a9e675c89029ee57842053d7950c45a156859db3b2372" exitCode=0
Apr 22 20:06:47.869760 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:06:47.869389 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" event={"ID":"994d1edd-7ea3-4d38-b647-0ba1b9f11f58","Type":"ContainerDied","Data":"862f10fc51d07019e13a9e675c89029ee57842053d7950c45a156859db3b2372"}
Apr 22 20:07:01.936007 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:01.935973 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" event={"ID":"994d1edd-7ea3-4d38-b647-0ba1b9f11f58","Type":"ContainerStarted","Data":"8975804e2c32dc9f5aca8b92224f1cb3c8494048ec26f6baea82365a465d55b0"}
Apr 22 20:07:03.946007 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:03.945967 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" event={"ID":"994d1edd-7ea3-4d38-b647-0ba1b9f11f58","Type":"ContainerStarted","Data":"a1834aa540357180827472e50ee11261fe3d3327691e857938cc3bfd52217b45"}
Apr 22 20:07:03.946488 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:03.946292 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw"
Apr 22 20:07:03.947464 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:03.947439 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 22 20:07:03.960887 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:03.960599 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podStartSLOduration=1.599874638 podStartE2EDuration="26.960585216s" podCreationTimestamp="2026-04-22 20:06:37 +0000 UTC" firstStartedPulling="2026-04-22 20:06:38.378517448 +0000 UTC m=+486.798424218" lastFinishedPulling="2026-04-22 20:07:03.739228025 +0000 UTC m=+512.159134796" observedRunningTime="2026-04-22 20:07:03.960417949 +0000 UTC m=+512.380324755" watchObservedRunningTime="2026-04-22 20:07:03.960585216 +0000 UTC m=+512.380492009"
Apr 22 20:07:04.949534 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:04.949486 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw"
Apr 22 20:07:04.949967 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:04.949579 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 22 20:07:04.950440 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:04.950418 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:07:05.953542 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:05.953495 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 22 20:07:05.953985 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:05.953833 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:07:15.954064 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:15.954009 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 22 20:07:15.954512 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:15.954431 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:07:25.953699 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:25.953610 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 22 20:07:25.954098 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:25.954009 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:07:35.954405 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:35.954344 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 22 20:07:35.954948 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:35.954750 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:07:45.954234 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:45.954188 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 22 20:07:45.954676 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:45.954655 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:07:55.954152 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:55.954091 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 22 20:07:55.954567 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:07:55.954529 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:08:05.954736 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:05.954700 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw"
Apr 22 20:08:05.955124 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:05.954782 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw"
Apr 22 20:08:13.088490 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:13.088456 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw"]
Apr 22 20:08:13.089061 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:13.088809 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="kserve-container" containerID="cri-o://8975804e2c32dc9f5aca8b92224f1cb3c8494048ec26f6baea82365a465d55b0" gracePeriod=30
Apr 22 20:08:13.089061 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:13.088896 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="agent" containerID="cri-o://a1834aa540357180827472e50ee11261fe3d3327691e857938cc3bfd52217b45" gracePeriod=30
Apr 22 20:08:13.157974 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:13.157940 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm"]
Apr 22 20:08:13.160577 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:13.160557 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm"
Apr 22 20:08:13.170070 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:13.170044 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm"]
Apr 22 20:08:13.238435 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:13.238397 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b14f346a-b1cc-4033-a89d-d868ec5538db-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm\" (UID: \"b14f346a-b1cc-4033-a89d-d868ec5538db\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm"
Apr 22 20:08:13.339554 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:13.339466 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b14f346a-b1cc-4033-a89d-d868ec5538db-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm\" (UID: \"b14f346a-b1cc-4033-a89d-d868ec5538db\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm"
Apr 22 20:08:13.339831 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:13.339809 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b14f346a-b1cc-4033-a89d-d868ec5538db-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm\" (UID: \"b14f346a-b1cc-4033-a89d-d868ec5538db\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm"
Apr 22 20:08:13.472472 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:13.472431 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm"
Apr 22 20:08:13.610539 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:13.610510 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm"]
Apr 22 20:08:13.613033 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:08:13.612996 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb14f346a_b1cc_4033_a89d_d868ec5538db.slice/crio-f66bd2dae0ad75d883cc0ecdf72fcbf09c1563a78c2e43993f5182bac421faaa WatchSource:0}: Error finding container f66bd2dae0ad75d883cc0ecdf72fcbf09c1563a78c2e43993f5182bac421faaa: Status 404 returned error can't find the container with id f66bd2dae0ad75d883cc0ecdf72fcbf09c1563a78c2e43993f5182bac421faaa
Apr 22 20:08:14.196027 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:14.195995 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" event={"ID":"b14f346a-b1cc-4033-a89d-d868ec5538db","Type":"ContainerStarted","Data":"a5381db7ac9ab7f5de5bd56a500aa8d5aa529a9824398f09ca3fd3e9e55be78b"}
Apr 22 20:08:14.196027 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:14.196033 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" event={"ID":"b14f346a-b1cc-4033-a89d-d868ec5538db","Type":"ContainerStarted","Data":"f66bd2dae0ad75d883cc0ecdf72fcbf09c1563a78c2e43993f5182bac421faaa"}
Apr 22 20:08:15.953989 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:15.953929 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused"
Apr 22 20:08:15.954381 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:15.954208 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:08:18.212700 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:18.212668 2583 generic.go:358] "Generic (PLEG): container finished" podID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerID="8975804e2c32dc9f5aca8b92224f1cb3c8494048ec26f6baea82365a465d55b0" exitCode=0
Apr 22 20:08:18.213173 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:18.212735 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" event={"ID":"994d1edd-7ea3-4d38-b647-0ba1b9f11f58","Type":"ContainerDied","Data":"8975804e2c32dc9f5aca8b92224f1cb3c8494048ec26f6baea82365a465d55b0"}
Apr 22 20:08:18.214136 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:18.214113 2583 generic.go:358] "Generic (PLEG): container finished" podID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerID="a5381db7ac9ab7f5de5bd56a500aa8d5aa529a9824398f09ca3fd3e9e55be78b" exitCode=0
Apr 22 20:08:18.214263 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:18.214147 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" event={"ID":"b14f346a-b1cc-4033-a89d-d868ec5538db","Type":"ContainerDied","Data":"a5381db7ac9ab7f5de5bd56a500aa8d5aa529a9824398f09ca3fd3e9e55be78b"}
event={"ID":"b14f346a-b1cc-4033-a89d-d868ec5538db","Type":"ContainerDied","Data":"a5381db7ac9ab7f5de5bd56a500aa8d5aa529a9824398f09ca3fd3e9e55be78b"} Apr 22 20:08:19.219725 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:19.219686 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" event={"ID":"b14f346a-b1cc-4033-a89d-d868ec5538db","Type":"ContainerStarted","Data":"37b12ca5d8981a153301ba1f2489bd4edd294abc7154a4ead7e394ad6e37315c"} Apr 22 20:08:19.219725 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:19.219738 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" event={"ID":"b14f346a-b1cc-4033-a89d-d868ec5538db","Type":"ContainerStarted","Data":"b8471c3c7cd080205c2da8ace01a58efc46fc348580b50c63af78274e0677079"} Apr 22 20:08:19.220248 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:19.220146 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" Apr 22 20:08:19.220248 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:19.220175 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" Apr 22 20:08:19.221745 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:19.221722 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:5000: connect: connection refused" Apr 22 20:08:19.222396 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:19.222365 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:08:19.238881 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:19.238812 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podStartSLOduration=6.23880009 podStartE2EDuration="6.23880009s" podCreationTimestamp="2026-04-22 20:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:08:19.237033568 +0000 UTC m=+587.656940360" watchObservedRunningTime="2026-04-22 20:08:19.23880009 +0000 UTC m=+587.658706878" Apr 22 20:08:20.223903 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:20.223841 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:5000: connect: connection refused" Apr 22 20:08:20.224372 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:20.224193 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:08:25.953978 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:25.953930 2583 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 22 20:08:25.954412 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:25.954262 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:08:30.224287 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:30.224229 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:5000: connect: connection refused" Apr 22 20:08:30.224717 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:30.224612 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:08:35.953944 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:35.953898 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 22 20:08:35.955213 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:35.954047 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" Apr 22 20:08:35.955213 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:35.954236 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:08:35.955213 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:35.954318 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" Apr 22 20:08:40.223949 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:40.223902 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:5000: connect: connection refused" Apr 22 20:08:40.224342 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:40.224277 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:08:43.237377 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.237353 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" Apr 22 20:08:43.276252 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.276214 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/994d1edd-7ea3-4d38-b647-0ba1b9f11f58-kserve-provision-location\") pod \"994d1edd-7ea3-4d38-b647-0ba1b9f11f58\" (UID: \"994d1edd-7ea3-4d38-b647-0ba1b9f11f58\") " Apr 22 20:08:43.276573 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.276549 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/994d1edd-7ea3-4d38-b647-0ba1b9f11f58-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "994d1edd-7ea3-4d38-b647-0ba1b9f11f58" (UID: "994d1edd-7ea3-4d38-b647-0ba1b9f11f58"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:08:43.307916 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.307856 2583 generic.go:358] "Generic (PLEG): container finished" podID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerID="a1834aa540357180827472e50ee11261fe3d3327691e857938cc3bfd52217b45" exitCode=0 Apr 22 20:08:43.308101 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.307930 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" event={"ID":"994d1edd-7ea3-4d38-b647-0ba1b9f11f58","Type":"ContainerDied","Data":"a1834aa540357180827472e50ee11261fe3d3327691e857938cc3bfd52217b45"} Apr 22 20:08:43.308101 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.307965 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw" event={"ID":"994d1edd-7ea3-4d38-b647-0ba1b9f11f58","Type":"ContainerDied","Data":"51dea5d8cca2a3d7764bbe032f12526c325baf514fc79e8df591e8a3e4bad94c"} Apr 22 20:08:43.308101 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.307985 2583 scope.go:117] "RemoveContainer" containerID="a1834aa540357180827472e50ee11261fe3d3327691e857938cc3bfd52217b45" Apr 22 20:08:43.308101 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.307984 2583 util.go:48] "No ready sandbox for pod can be found. 
Apr 22 20:08:43.316522 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.316500 2583 scope.go:117] "RemoveContainer" containerID="8975804e2c32dc9f5aca8b92224f1cb3c8494048ec26f6baea82365a465d55b0"
Apr 22 20:08:43.323653 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.323635 2583 scope.go:117] "RemoveContainer" containerID="862f10fc51d07019e13a9e675c89029ee57842053d7950c45a156859db3b2372"
Apr 22 20:08:43.330094 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.330072 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw"]
Apr 22 20:08:43.331080 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.331058 2583 scope.go:117] "RemoveContainer" containerID="a1834aa540357180827472e50ee11261fe3d3327691e857938cc3bfd52217b45"
Apr 22 20:08:43.331350 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:08:43.331325 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1834aa540357180827472e50ee11261fe3d3327691e857938cc3bfd52217b45\": container with ID starting with a1834aa540357180827472e50ee11261fe3d3327691e857938cc3bfd52217b45 not found: ID does not exist" containerID="a1834aa540357180827472e50ee11261fe3d3327691e857938cc3bfd52217b45"
Apr 22 20:08:43.331415 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.331354 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1834aa540357180827472e50ee11261fe3d3327691e857938cc3bfd52217b45"} err="failed to get container status \"a1834aa540357180827472e50ee11261fe3d3327691e857938cc3bfd52217b45\": rpc error: code = NotFound desc = could not find container \"a1834aa540357180827472e50ee11261fe3d3327691e857938cc3bfd52217b45\": container with ID starting with a1834aa540357180827472e50ee11261fe3d3327691e857938cc3bfd52217b45 not found: ID does not exist"
Apr 22 20:08:43.331415 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.331376 2583 scope.go:117] "RemoveContainer" containerID="8975804e2c32dc9f5aca8b92224f1cb3c8494048ec26f6baea82365a465d55b0"
Apr 22 20:08:43.331646 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:08:43.331623 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8975804e2c32dc9f5aca8b92224f1cb3c8494048ec26f6baea82365a465d55b0\": container with ID starting with 8975804e2c32dc9f5aca8b92224f1cb3c8494048ec26f6baea82365a465d55b0 not found: ID does not exist" containerID="8975804e2c32dc9f5aca8b92224f1cb3c8494048ec26f6baea82365a465d55b0"
Apr 22 20:08:43.331702 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.331659 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8975804e2c32dc9f5aca8b92224f1cb3c8494048ec26f6baea82365a465d55b0"} err="failed to get container status \"8975804e2c32dc9f5aca8b92224f1cb3c8494048ec26f6baea82365a465d55b0\": rpc error: code = NotFound desc = could not find container \"8975804e2c32dc9f5aca8b92224f1cb3c8494048ec26f6baea82365a465d55b0\": container with ID starting with 8975804e2c32dc9f5aca8b92224f1cb3c8494048ec26f6baea82365a465d55b0 not found: ID does not exist"
Apr 22 20:08:43.331702 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.331683 2583 scope.go:117] "RemoveContainer" containerID="862f10fc51d07019e13a9e675c89029ee57842053d7950c45a156859db3b2372"
Apr 22 20:08:43.331944 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:08:43.331926 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"862f10fc51d07019e13a9e675c89029ee57842053d7950c45a156859db3b2372\": container with ID starting with 862f10fc51d07019e13a9e675c89029ee57842053d7950c45a156859db3b2372 not found: ID does not exist" containerID="862f10fc51d07019e13a9e675c89029ee57842053d7950c45a156859db3b2372"
Apr 22 20:08:43.332020 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.331946 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862f10fc51d07019e13a9e675c89029ee57842053d7950c45a156859db3b2372"} err="failed to get container status \"862f10fc51d07019e13a9e675c89029ee57842053d7950c45a156859db3b2372\": rpc error: code = NotFound desc = could not find container \"862f10fc51d07019e13a9e675c89029ee57842053d7950c45a156859db3b2372\": container with ID starting with 862f10fc51d07019e13a9e675c89029ee57842053d7950c45a156859db3b2372 not found: ID does not exist"
Apr 22 20:08:43.336854 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.336832 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-675bd5f74c-hzrkw"]
Apr 22 20:08:43.377995 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:43.377923 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/994d1edd-7ea3-4d38-b647-0ba1b9f11f58-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\""
Apr 22 20:08:44.096257 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:44.096223 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" path="/var/lib/kubelet/pods/994d1edd-7ea3-4d38-b647-0ba1b9f11f58/volumes"
Apr 22 20:08:50.224276 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:50.224216 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:5000: connect: connection refused"
Apr 22 20:08:50.224665 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:08:50.224635 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:09:00.224040 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:00.223943 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:5000: connect: connection refused"
Apr 22 20:09:00.224472 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:00.224451 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:09:10.224722 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:10.224679 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:5000: connect: connection refused"
Apr 22 20:09:10.225146 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:10.225122 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:09:20.224039 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:20.223986 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:5000: connect: connection refused"
Apr 22 20:09:20.224475 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:20.224357 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:09:30.224618 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:30.224585 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm"
Apr 22 20:09:30.225068 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:30.224722 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm"
Apr 22 20:09:38.236803 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:38.236772 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm"]
Apr 22 20:09:38.237240 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:38.237133 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="kserve-container" containerID="cri-o://b8471c3c7cd080205c2da8ace01a58efc46fc348580b50c63af78274e0677079" gracePeriod=30
Apr 22 20:09:38.237311 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:38.237223 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="agent" containerID="cri-o://37b12ca5d8981a153301ba1f2489bd4edd294abc7154a4ead7e394ad6e37315c" gracePeriod=30
Apr 22 20:09:40.223965 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:40.223911 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:5000: connect: connection refused"
Apr 22 20:09:40.224434 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:40.224227 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:09:43.527982 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:43.527945 2583 generic.go:358] "Generic (PLEG): container finished" podID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerID="b8471c3c7cd080205c2da8ace01a58efc46fc348580b50c63af78274e0677079" exitCode=0
Apr 22 20:09:43.528352 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:43.528020 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" event={"ID":"b14f346a-b1cc-4033-a89d-d868ec5538db","Type":"ContainerDied","Data":"b8471c3c7cd080205c2da8ace01a58efc46fc348580b50c63af78274e0677079"}
Apr 22 20:09:48.320385 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.320354 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp"]
Apr 22 20:09:48.320804 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.320744 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="agent"
Apr 22 20:09:48.320804 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.320754 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="agent"
Apr 22 20:09:48.320804 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.320764 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="kserve-container"
Apr 22 20:09:48.320804 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.320771 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="kserve-container"
Apr 22 20:09:48.320804 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.320780 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="storage-initializer"
Apr 22 20:09:48.320804 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.320786 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="storage-initializer"
Apr 22 20:09:48.321013 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.320874 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="agent"
Apr 22 20:09:48.321013 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.320884 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="994d1edd-7ea3-4d38-b647-0ba1b9f11f58" containerName="kserve-container"
Apr 22 20:09:48.323063 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.323045 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp"
Apr 22 20:09:48.331927 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.331901 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp"]
Apr 22 20:09:48.447138 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.447100 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa2be06f-8cee-40bf-a856-a65644f68e34-kserve-provision-location\") pod \"isvc-logger-predictor-5db886b75b-cfjkp\" (UID: \"aa2be06f-8cee-40bf-a856-a65644f68e34\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp"
Apr 22 20:09:48.547639 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.547608 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa2be06f-8cee-40bf-a856-a65644f68e34-kserve-provision-location\") pod \"isvc-logger-predictor-5db886b75b-cfjkp\" (UID: \"aa2be06f-8cee-40bf-a856-a65644f68e34\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp"
Apr 22 20:09:48.547982 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.547964 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa2be06f-8cee-40bf-a856-a65644f68e34-kserve-provision-location\") pod \"isvc-logger-predictor-5db886b75b-cfjkp\" (UID: \"aa2be06f-8cee-40bf-a856-a65644f68e34\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp"
Apr 22 20:09:48.635401 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.635305 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp"
Apr 22 20:09:48.757670 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.757642 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp"]
Apr 22 20:09:48.760208 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:09:48.760181 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa2be06f_8cee_40bf_a856_a65644f68e34.slice/crio-9925c9e9494921c34dc388c7b2ea827bc9f14674d2b5803ac246423c6010e6c4 WatchSource:0}: Error finding container 9925c9e9494921c34dc388c7b2ea827bc9f14674d2b5803ac246423c6010e6c4: Status 404 returned error can't find the container with id 9925c9e9494921c34dc388c7b2ea827bc9f14674d2b5803ac246423c6010e6c4
Apr 22 20:09:48.762243 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:48.762222 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:09:49.550295 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:49.550256 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" event={"ID":"aa2be06f-8cee-40bf-a856-a65644f68e34","Type":"ContainerStarted","Data":"2293b4d02d05228bd4351684ccdc2d121ba4554bf1a0cd37b5a814ad7ae75030"}
Apr 22 20:09:49.550295 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:49.550296 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" event={"ID":"aa2be06f-8cee-40bf-a856-a65644f68e34","Type":"ContainerStarted","Data":"9925c9e9494921c34dc388c7b2ea827bc9f14674d2b5803ac246423c6010e6c4"}
Apr 22 20:09:50.224194 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:50.224154 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:5000: connect: connection refused"
Apr 22 20:09:50.224503 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:50.224481 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:09:53.564214 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:53.564179 2583 generic.go:358] "Generic (PLEG): container finished" podID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerID="2293b4d02d05228bd4351684ccdc2d121ba4554bf1a0cd37b5a814ad7ae75030" exitCode=0
Apr 22 20:09:53.564608 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:53.564254 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" event={"ID":"aa2be06f-8cee-40bf-a856-a65644f68e34","Type":"ContainerDied","Data":"2293b4d02d05228bd4351684ccdc2d121ba4554bf1a0cd37b5a814ad7ae75030"}
Apr 22 20:09:54.569786 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:54.569750 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" event={"ID":"aa2be06f-8cee-40bf-a856-a65644f68e34","Type":"ContainerStarted","Data":"6dc13323c2ecbf6a42a6f1d6a2156d940cd46665ebe527ea36dfdd7c71587efc"}
Apr 22 20:09:54.569786 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:54.569793 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" event={"ID":"aa2be06f-8cee-40bf-a856-a65644f68e34","Type":"ContainerStarted","Data":"a8ebffcd8967da9e74837ce85c5ca0b5ac5c80036c22756a9a97e82c1675b867"}
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" event={"ID":"aa2be06f-8cee-40bf-a856-a65644f68e34","Type":"ContainerStarted","Data":"a8ebffcd8967da9e74837ce85c5ca0b5ac5c80036c22756a9a97e82c1675b867"} Apr 22 20:09:54.570286 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:54.570184 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" Apr 22 20:09:54.570286 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:54.570213 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" Apr 22 20:09:54.571490 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:54.571462 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 22 20:09:54.572199 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:54.572178 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:09:54.585849 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:54.585799 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podStartSLOduration=6.585786301 podStartE2EDuration="6.585786301s" podCreationTimestamp="2026-04-22 20:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:09:54.584511008 +0000 UTC m=+683.004417798" watchObservedRunningTime="2026-04-22 20:09:54.585786301 +0000 UTC m=+683.005693092" Apr 22 20:09:55.573969 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:55.573925 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 22 20:09:55.574396 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:09:55.574367 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:10:00.224788 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:00.224734 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:5000: connect: connection refused" Apr 22 20:10:00.225289 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:00.224922 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" Apr 22 20:10:00.225289 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:00.225085 2583 prober.go:120] "Probe failed" probeType="Readiness" 
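The repeated prober.go "Probe failed" entries for isvc-logger-predictor-5db886b75b-cfjkp are the expected pattern while a server is still binding its port: the kubelet keeps probing on the configured period and reports failure until the first successful connect flips the pod to Ready (which happens at 20:11:05 below). Whether the underlying probe is a TCP socket check or an HTTP GET, the first step is the same TCP connect that produces "connection refused" here; a minimal sketch of that check follows (the address is the pod IP and port from the entries above; this is illustrative, not the kubelet's actual prober source):

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    // probeTCP mirrors the spirit of a readiness check: try to open a TCP
    // connection within a timeout. While the server is still starting, the
    // dial fails with "connect: connection refused", exactly as logged above.
    func probeTCP(addr string, timeout time.Duration) error {
        conn, err := net.DialTimeout("tcp", addr, timeout)
        if err != nil {
            return err // probeResult="failure"
        }
        return conn.Close() // probeResult="success"
    }

    func main() {
        // 10.134.0.40:8080 is the pod IP and port from the entries above.
        if err := probeTCP("10.134.0.40:8080", time.Second); err != nil {
            fmt.Println("Probe failed:", err)
        } else {
            fmt.Println("Probe succeeded")
        }
    }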
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:10:00.225289 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:00.225188 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" Apr 22 20:10:05.574048 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:05.574001 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 22 20:10:05.574525 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:05.574502 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:10:08.392980 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.392956 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" Apr 22 20:10:08.424074 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.424042 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b14f346a-b1cc-4033-a89d-d868ec5538db-kserve-provision-location\") pod \"b14f346a-b1cc-4033-a89d-d868ec5538db\" (UID: \"b14f346a-b1cc-4033-a89d-d868ec5538db\") " Apr 22 20:10:08.424356 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.424328 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b14f346a-b1cc-4033-a89d-d868ec5538db-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b14f346a-b1cc-4033-a89d-d868ec5538db" (UID: "b14f346a-b1cc-4033-a89d-d868ec5538db"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:10:08.524826 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.524743 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b14f346a-b1cc-4033-a89d-d868ec5538db-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:10:08.619433 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.619400 2583 generic.go:358] "Generic (PLEG): container finished" podID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerID="37b12ca5d8981a153301ba1f2489bd4edd294abc7154a4ead7e394ad6e37315c" exitCode=0 Apr 22 20:10:08.619598 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.619487 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" Apr 22 20:10:08.619598 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.619482 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" event={"ID":"b14f346a-b1cc-4033-a89d-d868ec5538db","Type":"ContainerDied","Data":"37b12ca5d8981a153301ba1f2489bd4edd294abc7154a4ead7e394ad6e37315c"} Apr 22 20:10:08.619598 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.619587 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm" event={"ID":"b14f346a-b1cc-4033-a89d-d868ec5538db","Type":"ContainerDied","Data":"f66bd2dae0ad75d883cc0ecdf72fcbf09c1563a78c2e43993f5182bac421faaa"} Apr 22 20:10:08.619728 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.619606 2583 scope.go:117] "RemoveContainer" containerID="37b12ca5d8981a153301ba1f2489bd4edd294abc7154a4ead7e394ad6e37315c" Apr 22 20:10:08.627840 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.627823 2583 scope.go:117] "RemoveContainer" containerID="b8471c3c7cd080205c2da8ace01a58efc46fc348580b50c63af78274e0677079" Apr 22 20:10:08.635480 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.635462 2583 scope.go:117] "RemoveContainer" containerID="a5381db7ac9ab7f5de5bd56a500aa8d5aa529a9824398f09ca3fd3e9e55be78b" Apr 22 20:10:08.641548 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.641525 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm"] Apr 22 20:10:08.643589 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.643570 2583 scope.go:117] "RemoveContainer" containerID="37b12ca5d8981a153301ba1f2489bd4edd294abc7154a4ead7e394ad6e37315c" Apr 22 20:10:08.644053 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:10:08.644033 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37b12ca5d8981a153301ba1f2489bd4edd294abc7154a4ead7e394ad6e37315c\": container with ID starting with 37b12ca5d8981a153301ba1f2489bd4edd294abc7154a4ead7e394ad6e37315c not found: ID does not exist" containerID="37b12ca5d8981a153301ba1f2489bd4edd294abc7154a4ead7e394ad6e37315c" Apr 22 20:10:08.644151 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.644061 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37b12ca5d8981a153301ba1f2489bd4edd294abc7154a4ead7e394ad6e37315c"} err="failed to get container status \"37b12ca5d8981a153301ba1f2489bd4edd294abc7154a4ead7e394ad6e37315c\": rpc error: code = NotFound desc = could not find container \"37b12ca5d8981a153301ba1f2489bd4edd294abc7154a4ead7e394ad6e37315c\": container with ID starting with 37b12ca5d8981a153301ba1f2489bd4edd294abc7154a4ead7e394ad6e37315c not found: ID does not exist" Apr 22 20:10:08.644151 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.644085 2583 scope.go:117] "RemoveContainer" containerID="b8471c3c7cd080205c2da8ace01a58efc46fc348580b50c63af78274e0677079" Apr 22 20:10:08.644348 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:10:08.644323 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8471c3c7cd080205c2da8ace01a58efc46fc348580b50c63af78274e0677079\": container with ID starting with b8471c3c7cd080205c2da8ace01a58efc46fc348580b50c63af78274e0677079 not found: ID does not exist" 
containerID="b8471c3c7cd080205c2da8ace01a58efc46fc348580b50c63af78274e0677079" Apr 22 20:10:08.644402 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.644358 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8471c3c7cd080205c2da8ace01a58efc46fc348580b50c63af78274e0677079"} err="failed to get container status \"b8471c3c7cd080205c2da8ace01a58efc46fc348580b50c63af78274e0677079\": rpc error: code = NotFound desc = could not find container \"b8471c3c7cd080205c2da8ace01a58efc46fc348580b50c63af78274e0677079\": container with ID starting with b8471c3c7cd080205c2da8ace01a58efc46fc348580b50c63af78274e0677079 not found: ID does not exist" Apr 22 20:10:08.644402 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.644383 2583 scope.go:117] "RemoveContainer" containerID="a5381db7ac9ab7f5de5bd56a500aa8d5aa529a9824398f09ca3fd3e9e55be78b" Apr 22 20:10:08.644656 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:10:08.644638 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5381db7ac9ab7f5de5bd56a500aa8d5aa529a9824398f09ca3fd3e9e55be78b\": container with ID starting with a5381db7ac9ab7f5de5bd56a500aa8d5aa529a9824398f09ca3fd3e9e55be78b not found: ID does not exist" containerID="a5381db7ac9ab7f5de5bd56a500aa8d5aa529a9824398f09ca3fd3e9e55be78b" Apr 22 20:10:08.644701 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.644661 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5381db7ac9ab7f5de5bd56a500aa8d5aa529a9824398f09ca3fd3e9e55be78b"} err="failed to get container status \"a5381db7ac9ab7f5de5bd56a500aa8d5aa529a9824398f09ca3fd3e9e55be78b\": rpc error: code = NotFound desc = could not find container \"a5381db7ac9ab7f5de5bd56a500aa8d5aa529a9824398f09ca3fd3e9e55be78b\": container with ID starting with a5381db7ac9ab7f5de5bd56a500aa8d5aa529a9824398f09ca3fd3e9e55be78b not found: ID does not exist" Apr 22 20:10:08.645470 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:08.645454 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-7f9dddd76d-ttfmm"] Apr 22 20:10:10.096534 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:10.096500 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" path="/var/lib/kubelet/pods/b14f346a-b1cc-4033-a89d-d868ec5538db/volumes" Apr 22 20:10:15.574231 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:15.574177 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 22 20:10:15.574660 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:15.574633 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:10:25.574790 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:25.574688 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 
Apr 22 20:10:35.574214 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:35.574162 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 20:10:35.574639 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:35.574586 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:10:45.574541 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:45.574494 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 20:10:45.575026 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:45.574946 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:10:55.574889 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:55.574806 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused"
Apr 22 20:10:55.575363 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:10:55.575339 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:11:05.574723 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:05.574690 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp"
Apr 22 20:11:05.575262 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:05.574982 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp"
Apr 22 20:11:13.521526 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.521490 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp"]
Apr 22 20:11:13.521941 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.521791 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="kserve-container" containerID="cri-o://a8ebffcd8967da9e74837ce85c5ca0b5ac5c80036c22756a9a97e82c1675b867" gracePeriod=30
Apr 22 20:11:13.522015 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.521927 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="agent" containerID="cri-o://6dc13323c2ecbf6a42a6f1d6a2156d940cd46665ebe527ea36dfdd7c71587efc" gracePeriod=30
Apr 22 20:11:13.558945 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.558916 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8"]
Apr 22 20:11:13.559314 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.559302 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="storage-initializer"
Apr 22 20:11:13.559373 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.559316 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="storage-initializer"
Apr 22 20:11:13.559373 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.559326 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="kserve-container"
Apr 22 20:11:13.559373 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.559332 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="kserve-container"
Apr 22 20:11:13.559373 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.559340 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="agent"
Apr 22 20:11:13.559373 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.559345 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="agent"
Apr 22 20:11:13.559559 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.559407 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="kserve-container"
Apr 22 20:11:13.559559 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.559416 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="b14f346a-b1cc-4033-a89d-d868ec5538db" containerName="agent"
Apr 22 20:11:13.563221 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.563199 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8"
Apr 22 20:11:13.569837 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.569812 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8"]
Apr 22 20:11:13.702826 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.702787 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de53dedc-40f8-4155-b015-18bbe23dc056-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-lz6x8\" (UID: \"de53dedc-40f8-4155-b015-18bbe23dc056\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8"
Apr 22 20:11:13.803738 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.803641 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de53dedc-40f8-4155-b015-18bbe23dc056-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-lz6x8\" (UID: \"de53dedc-40f8-4155-b015-18bbe23dc056\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8"
Apr 22 20:11:13.804110 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.804087 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de53dedc-40f8-4155-b015-18bbe23dc056-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-lz6x8\" (UID: \"de53dedc-40f8-4155-b015-18bbe23dc056\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8"
Apr 22 20:11:13.875770 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:13.875738 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" Apr 22 20:11:14.007776 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:14.007736 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8"] Apr 22 20:11:14.010659 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:11:14.010630 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde53dedc_40f8_4155_b015_18bbe23dc056.slice/crio-6039245a18bb76fef1eb3c67255c6047fe58a5ce1e0fbf286e9406538983ff0c WatchSource:0}: Error finding container 6039245a18bb76fef1eb3c67255c6047fe58a5ce1e0fbf286e9406538983ff0c: Status 404 returned error can't find the container with id 6039245a18bb76fef1eb3c67255c6047fe58a5ce1e0fbf286e9406538983ff0c Apr 22 20:11:14.857218 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:14.857182 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" event={"ID":"de53dedc-40f8-4155-b015-18bbe23dc056","Type":"ContainerStarted","Data":"3bb1ca22691f6469df203f9027000a4e8517dbceeb1baa050c24348f803d14f8"} Apr 22 20:11:14.857218 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:14.857222 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" event={"ID":"de53dedc-40f8-4155-b015-18bbe23dc056","Type":"ContainerStarted","Data":"6039245a18bb76fef1eb3c67255c6047fe58a5ce1e0fbf286e9406538983ff0c"} Apr 22 20:11:15.574055 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:15.573999 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 22 20:11:15.574391 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:15.574362 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:11:17.868924 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:17.868879 2583 generic.go:358] "Generic (PLEG): container finished" podID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerID="a8ebffcd8967da9e74837ce85c5ca0b5ac5c80036c22756a9a97e82c1675b867" exitCode=0 Apr 22 20:11:17.869320 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:17.868962 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" event={"ID":"aa2be06f-8cee-40bf-a856-a65644f68e34","Type":"ContainerDied","Data":"a8ebffcd8967da9e74837ce85c5ca0b5ac5c80036c22756a9a97e82c1675b867"} Apr 22 20:11:18.874376 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:18.874336 2583 generic.go:358] "Generic (PLEG): container finished" podID="de53dedc-40f8-4155-b015-18bbe23dc056" containerID="3bb1ca22691f6469df203f9027000a4e8517dbceeb1baa050c24348f803d14f8" exitCode=0 Apr 22 20:11:18.874906 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:18.874414 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" event={"ID":"de53dedc-40f8-4155-b015-18bbe23dc056","Type":"ContainerDied","Data":"3bb1ca22691f6469df203f9027000a4e8517dbceeb1baa050c24348f803d14f8"} Apr 22 20:11:25.574292 
ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:25.574240 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 22 20:11:25.574757 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:25.574610 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:11:25.903663 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:25.903633 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" event={"ID":"de53dedc-40f8-4155-b015-18bbe23dc056","Type":"ContainerStarted","Data":"e244988b72c772825256f16a5b69aa23a372495a6c91da8a5fd64f5ec34999c7"} Apr 22 20:11:25.903931 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:25.903914 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" Apr 22 20:11:25.905184 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:25.905157 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 20:11:25.919363 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:25.919311 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" podStartSLOduration=5.985541153 podStartE2EDuration="12.919296812s" podCreationTimestamp="2026-04-22 20:11:13 +0000 UTC" firstStartedPulling="2026-04-22 20:11:18.875778597 +0000 UTC m=+767.295685366" lastFinishedPulling="2026-04-22 20:11:25.809534253 +0000 UTC m=+774.229441025" observedRunningTime="2026-04-22 20:11:25.917488984 +0000 UTC m=+774.337395776" watchObservedRunningTime="2026-04-22 20:11:25.919296812 +0000 UTC m=+774.339203603" Apr 22 20:11:26.907828 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:26.907788 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 20:11:35.574800 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:35.574748 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 22 20:11:35.575296 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:35.574963 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" Apr 22 20:11:35.575296 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:35.575119 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="agent" probeResult="failure" 
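The "Observed pod startup duration" entry for isvc-lightgbm-predictor-78c8d484d6-lz6x8 above shows how the two durations relate: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (12.919s), and podStartSLOduration additionally subtracts the image-pull window, lastFinishedPulling minus firstStartedPulling (6.934s), leaving about 5.986s. A small sketch reproducing that arithmetic from the logged timestamps follows (the subtraction is the point, not kubelet's exact bookkeeping; kubelet works from its monotonic m=+... readings, which is why the logged 5.985541153 differs from this by a few nanoseconds):

    package main

    import (
        "fmt"
        "time"
    )

    const layout = "2006-01-02 15:04:05 -0700 MST" // format of the log fields

    func mustParse(v string) time.Time {
        t, err := time.Parse(layout, v)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        // Values copied from the "Observed pod startup duration" entry above.
        created := mustParse("2026-04-22 20:11:13 +0000 UTC")
        firstPull := mustParse("2026-04-22 20:11:18.875778597 +0000 UTC")
        lastPull := mustParse("2026-04-22 20:11:25.809534253 +0000 UTC")
        running := mustParse("2026-04-22 20:11:25.919296812 +0000 UTC")

        e2e := running.Sub(created)          // podStartE2EDuration: 12.919296812s
        slo := e2e - lastPull.Sub(firstPull) // minus the image-pull window
        fmt.Println("E2E:", e2e)
        fmt.Println("SLO:", slo) // ~5.985541156s vs logged 5.985541153
    }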
output="HTTP probe failed with statuscode: 503" Apr 22 20:11:35.575296 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:35.575229 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" Apr 22 20:11:36.907979 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:36.907936 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 20:11:43.669591 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.669562 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" Apr 22 20:11:43.781913 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.781817 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa2be06f-8cee-40bf-a856-a65644f68e34-kserve-provision-location\") pod \"aa2be06f-8cee-40bf-a856-a65644f68e34\" (UID: \"aa2be06f-8cee-40bf-a856-a65644f68e34\") " Apr 22 20:11:43.782133 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.782108 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa2be06f-8cee-40bf-a856-a65644f68e34-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "aa2be06f-8cee-40bf-a856-a65644f68e34" (UID: "aa2be06f-8cee-40bf-a856-a65644f68e34"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:11:43.782242 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.782224 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa2be06f-8cee-40bf-a856-a65644f68e34-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:11:43.968079 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.968038 2583 generic.go:358] "Generic (PLEG): container finished" podID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerID="6dc13323c2ecbf6a42a6f1d6a2156d940cd46665ebe527ea36dfdd7c71587efc" exitCode=137 Apr 22 20:11:43.968261 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.968126 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" Apr 22 20:11:43.968261 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.968124 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" event={"ID":"aa2be06f-8cee-40bf-a856-a65644f68e34","Type":"ContainerDied","Data":"6dc13323c2ecbf6a42a6f1d6a2156d940cd46665ebe527ea36dfdd7c71587efc"} Apr 22 20:11:43.968261 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.968175 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp" event={"ID":"aa2be06f-8cee-40bf-a856-a65644f68e34","Type":"ContainerDied","Data":"9925c9e9494921c34dc388c7b2ea827bc9f14674d2b5803ac246423c6010e6c4"} Apr 22 20:11:43.968261 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.968198 2583 scope.go:117] "RemoveContainer" containerID="6dc13323c2ecbf6a42a6f1d6a2156d940cd46665ebe527ea36dfdd7c71587efc" Apr 22 20:11:43.978516 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.978498 2583 scope.go:117] "RemoveContainer" containerID="a8ebffcd8967da9e74837ce85c5ca0b5ac5c80036c22756a9a97e82c1675b867" Apr 22 20:11:43.986993 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.986973 2583 scope.go:117] "RemoveContainer" containerID="2293b4d02d05228bd4351684ccdc2d121ba4554bf1a0cd37b5a814ad7ae75030" Apr 22 20:11:43.993486 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.993454 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp"] Apr 22 20:11:43.996170 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.996145 2583 scope.go:117] "RemoveContainer" containerID="6dc13323c2ecbf6a42a6f1d6a2156d940cd46665ebe527ea36dfdd7c71587efc" Apr 22 20:11:43.996634 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:11:43.996470 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dc13323c2ecbf6a42a6f1d6a2156d940cd46665ebe527ea36dfdd7c71587efc\": container with ID starting with 6dc13323c2ecbf6a42a6f1d6a2156d940cd46665ebe527ea36dfdd7c71587efc not found: ID does not exist" containerID="6dc13323c2ecbf6a42a6f1d6a2156d940cd46665ebe527ea36dfdd7c71587efc" Apr 22 20:11:43.996634 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.996568 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc13323c2ecbf6a42a6f1d6a2156d940cd46665ebe527ea36dfdd7c71587efc"} err="failed to get container status \"6dc13323c2ecbf6a42a6f1d6a2156d940cd46665ebe527ea36dfdd7c71587efc\": rpc error: code = NotFound desc = could not find container \"6dc13323c2ecbf6a42a6f1d6a2156d940cd46665ebe527ea36dfdd7c71587efc\": container with ID starting with 6dc13323c2ecbf6a42a6f1d6a2156d940cd46665ebe527ea36dfdd7c71587efc not found: ID does not exist" Apr 22 20:11:43.996634 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.996596 2583 scope.go:117] "RemoveContainer" containerID="a8ebffcd8967da9e74837ce85c5ca0b5ac5c80036c22756a9a97e82c1675b867" Apr 22 20:11:43.997267 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:11:43.997242 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8ebffcd8967da9e74837ce85c5ca0b5ac5c80036c22756a9a97e82c1675b867\": container with ID starting with a8ebffcd8967da9e74837ce85c5ca0b5ac5c80036c22756a9a97e82c1675b867 not found: ID does not exist" 
containerID="a8ebffcd8967da9e74837ce85c5ca0b5ac5c80036c22756a9a97e82c1675b867" Apr 22 20:11:43.997355 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.997278 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ebffcd8967da9e74837ce85c5ca0b5ac5c80036c22756a9a97e82c1675b867"} err="failed to get container status \"a8ebffcd8967da9e74837ce85c5ca0b5ac5c80036c22756a9a97e82c1675b867\": rpc error: code = NotFound desc = could not find container \"a8ebffcd8967da9e74837ce85c5ca0b5ac5c80036c22756a9a97e82c1675b867\": container with ID starting with a8ebffcd8967da9e74837ce85c5ca0b5ac5c80036c22756a9a97e82c1675b867 not found: ID does not exist" Apr 22 20:11:43.997355 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.997303 2583 scope.go:117] "RemoveContainer" containerID="2293b4d02d05228bd4351684ccdc2d121ba4554bf1a0cd37b5a814ad7ae75030" Apr 22 20:11:43.997583 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:11:43.997562 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2293b4d02d05228bd4351684ccdc2d121ba4554bf1a0cd37b5a814ad7ae75030\": container with ID starting with 2293b4d02d05228bd4351684ccdc2d121ba4554bf1a0cd37b5a814ad7ae75030 not found: ID does not exist" containerID="2293b4d02d05228bd4351684ccdc2d121ba4554bf1a0cd37b5a814ad7ae75030" Apr 22 20:11:43.997666 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.997595 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2293b4d02d05228bd4351684ccdc2d121ba4554bf1a0cd37b5a814ad7ae75030"} err="failed to get container status \"2293b4d02d05228bd4351684ccdc2d121ba4554bf1a0cd37b5a814ad7ae75030\": rpc error: code = NotFound desc = could not find container \"2293b4d02d05228bd4351684ccdc2d121ba4554bf1a0cd37b5a814ad7ae75030\": container with ID starting with 2293b4d02d05228bd4351684ccdc2d121ba4554bf1a0cd37b5a814ad7ae75030 not found: ID does not exist" Apr 22 20:11:43.999972 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:43.999951 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5db886b75b-cfjkp"] Apr 22 20:11:44.097087 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:44.097007 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" path="/var/lib/kubelet/pods/aa2be06f-8cee-40bf-a856-a65644f68e34/volumes" Apr 22 20:11:46.908023 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:46.907979 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 20:11:56.908813 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:11:56.908718 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 20:12:06.908604 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:06.908557 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: 
connection refused" Apr 22 20:12:16.908127 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:16.908081 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 20:12:26.908207 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:26.908160 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 20:12:36.908063 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:36.908014 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 20:12:44.097297 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:44.097271 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" Apr 22 20:12:53.702208 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:53.702158 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8"] Apr 22 20:12:53.702775 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:53.702566 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="kserve-container" containerID="cri-o://e244988b72c772825256f16a5b69aa23a372495a6c91da8a5fd64f5ec34999c7" gracePeriod=30 Apr 22 20:12:53.773572 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:53.773540 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq"] Apr 22 20:12:53.773970 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:53.773943 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="kserve-container" Apr 22 20:12:53.773970 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:53.773960 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="kserve-container" Apr 22 20:12:53.774214 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:53.773977 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="agent" Apr 22 20:12:53.774214 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:53.773986 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="agent" Apr 22 20:12:53.774214 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:53.774017 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="storage-initializer" Apr 22 20:12:53.774214 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:53.774024 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="storage-initializer" Apr 22 20:12:53.774214 ip-10-0-135-221 
kubenswrapper[2583]: I0422 20:12:53.774096 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="kserve-container" Apr 22 20:12:53.774214 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:53.774107 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa2be06f-8cee-40bf-a856-a65644f68e34" containerName="agent" Apr 22 20:12:53.778461 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:53.778441 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" Apr 22 20:12:53.784548 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:53.784518 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq"] Apr 22 20:12:53.904013 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:53.903981 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf3140b2-da98-4446-821a-68c6956fa1d9-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq\" (UID: \"bf3140b2-da98-4446-821a-68c6956fa1d9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" Apr 22 20:12:54.005531 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:54.005440 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf3140b2-da98-4446-821a-68c6956fa1d9-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq\" (UID: \"bf3140b2-da98-4446-821a-68c6956fa1d9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" Apr 22 20:12:54.005900 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:54.005848 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf3140b2-da98-4446-821a-68c6956fa1d9-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq\" (UID: \"bf3140b2-da98-4446-821a-68c6956fa1d9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" Apr 22 20:12:54.090063 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:54.090032 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" Apr 22 20:12:54.093218 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:54.093188 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 22 20:12:54.218315 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:54.218290 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq"] Apr 22 20:12:54.220507 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:12:54.220475 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf3140b2_da98_4446_821a_68c6956fa1d9.slice/crio-316819e07590986fe98f8ec74e168754e452a6efee5c4467080f73bb3a520e24 WatchSource:0}: Error finding container 316819e07590986fe98f8ec74e168754e452a6efee5c4467080f73bb3a520e24: Status 404 returned error can't find the container with id 316819e07590986fe98f8ec74e168754e452a6efee5c4467080f73bb3a520e24 Apr 22 20:12:55.214605 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:55.214564 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" event={"ID":"bf3140b2-da98-4446-821a-68c6956fa1d9","Type":"ContainerStarted","Data":"ad7f1a335b9a549a85fa381485fe22d8fcc36af0d2e20085e663f802a229493c"} Apr 22 20:12:55.215015 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:55.214612 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" event={"ID":"bf3140b2-da98-4446-821a-68c6956fa1d9","Type":"ContainerStarted","Data":"316819e07590986fe98f8ec74e168754e452a6efee5c4467080f73bb3a520e24"} Apr 22 20:12:58.225795 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:58.225760 2583 generic.go:358] "Generic (PLEG): container finished" podID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerID="ad7f1a335b9a549a85fa381485fe22d8fcc36af0d2e20085e663f802a229493c" exitCode=0 Apr 22 20:12:58.226181 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:58.225831 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" event={"ID":"bf3140b2-da98-4446-821a-68c6956fa1d9","Type":"ContainerDied","Data":"ad7f1a335b9a549a85fa381485fe22d8fcc36af0d2e20085e663f802a229493c"} Apr 22 20:12:59.231915 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:59.231887 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" event={"ID":"bf3140b2-da98-4446-821a-68c6956fa1d9","Type":"ContainerStarted","Data":"7386b67547d9c3211b884888793473b2ba06b22e380b1e018669b1a7b086224e"} Apr 22 20:12:59.232295 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:59.232234 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" Apr 22 20:12:59.233438 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:59.233399 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: 
connection refused" Apr 22 20:12:59.233557 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:59.233496 2583 generic.go:358] "Generic (PLEG): container finished" podID="de53dedc-40f8-4155-b015-18bbe23dc056" containerID="e244988b72c772825256f16a5b69aa23a372495a6c91da8a5fd64f5ec34999c7" exitCode=0 Apr 22 20:12:59.233557 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:59.233535 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" event={"ID":"de53dedc-40f8-4155-b015-18bbe23dc056","Type":"ContainerDied","Data":"e244988b72c772825256f16a5b69aa23a372495a6c91da8a5fd64f5ec34999c7"} Apr 22 20:12:59.233557 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:59.233555 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" event={"ID":"de53dedc-40f8-4155-b015-18bbe23dc056","Type":"ContainerDied","Data":"6039245a18bb76fef1eb3c67255c6047fe58a5ce1e0fbf286e9406538983ff0c"} Apr 22 20:12:59.233720 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:59.233565 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6039245a18bb76fef1eb3c67255c6047fe58a5ce1e0fbf286e9406538983ff0c" Apr 22 20:12:59.242812 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:59.242796 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" Apr 22 20:12:59.249261 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:59.249094 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de53dedc-40f8-4155-b015-18bbe23dc056-kserve-provision-location\") pod \"de53dedc-40f8-4155-b015-18bbe23dc056\" (UID: \"de53dedc-40f8-4155-b015-18bbe23dc056\") " Apr 22 20:12:59.249629 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:59.249516 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de53dedc-40f8-4155-b015-18bbe23dc056-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "de53dedc-40f8-4155-b015-18bbe23dc056" (UID: "de53dedc-40f8-4155-b015-18bbe23dc056"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:12:59.250808 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:59.250766 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" podStartSLOduration=6.250754989 podStartE2EDuration="6.250754989s" podCreationTimestamp="2026-04-22 20:12:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:12:59.248745611 +0000 UTC m=+867.668652404" watchObservedRunningTime="2026-04-22 20:12:59.250754989 +0000 UTC m=+867.670661780" Apr 22 20:12:59.349804 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:12:59.349763 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de53dedc-40f8-4155-b015-18bbe23dc056-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:13:00.236851 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:13:00.236823 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8" Apr 22 20:13:00.237236 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:13:00.237072 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 20:13:00.253624 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:13:00.253588 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8"] Apr 22 20:13:00.257562 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:13:00.257536 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-lz6x8"] Apr 22 20:13:02.097302 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:13:02.097268 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" path="/var/lib/kubelet/pods/de53dedc-40f8-4155-b015-18bbe23dc056/volumes" Apr 22 20:13:10.237448 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:13:10.237403 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 20:13:20.237817 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:13:20.237772 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 20:13:30.237213 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:13:30.237173 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 20:13:40.238172 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:13:40.238122 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 20:13:50.237941 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:13:50.237901 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 20:14:00.237392 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:00.237348 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 20:14:10.237505 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:10.237456 2583 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.42:8080: connect: connection refused" Apr 22 20:14:14.097938 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:14.097909 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" Apr 22 20:14:24.457632 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:24.457584 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq"] Apr 22 20:14:24.458190 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:24.457951 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerName="kserve-container" containerID="cri-o://7386b67547d9c3211b884888793473b2ba06b22e380b1e018669b1a7b086224e" gracePeriod=30 Apr 22 20:14:24.573968 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:24.573928 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd"] Apr 22 20:14:24.574326 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:24.574310 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="storage-initializer" Apr 22 20:14:24.574326 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:24.574327 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="storage-initializer" Apr 22 20:14:24.574432 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:24.574348 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="kserve-container" Apr 22 20:14:24.574432 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:24.574354 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="kserve-container" Apr 22 20:14:24.574432 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:24.574414 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="de53dedc-40f8-4155-b015-18bbe23dc056" containerName="kserve-container" Apr 22 20:14:24.577648 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:24.577627 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" Apr 22 20:14:24.585931 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:24.585903 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd"] Apr 22 20:14:24.688503 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:24.688462 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67a17741-1a2e-4f5f-b054-315dbafc71c2-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd\" (UID: \"67a17741-1a2e-4f5f-b054-315dbafc71c2\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" Apr 22 20:14:24.789563 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:24.789458 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67a17741-1a2e-4f5f-b054-315dbafc71c2-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd\" (UID: \"67a17741-1a2e-4f5f-b054-315dbafc71c2\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" Apr 22 20:14:24.789854 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:24.789833 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67a17741-1a2e-4f5f-b054-315dbafc71c2-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd\" (UID: \"67a17741-1a2e-4f5f-b054-315dbafc71c2\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" Apr 22 20:14:24.888700 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:24.888660 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" Apr 22 20:14:25.013189 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:25.013165 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd"] Apr 22 20:14:25.015982 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:14:25.015946 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67a17741_1a2e_4f5f_b054_315dbafc71c2.slice/crio-0b0b369ed7188112b3f542a642b999a954ca25dc15b5339c6f7b77bc71cb255b WatchSource:0}: Error finding container 0b0b369ed7188112b3f542a642b999a954ca25dc15b5339c6f7b77bc71cb255b: Status 404 returned error can't find the container with id 0b0b369ed7188112b3f542a642b999a954ca25dc15b5339c6f7b77bc71cb255b Apr 22 20:14:25.538711 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:25.538671 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" event={"ID":"67a17741-1a2e-4f5f-b054-315dbafc71c2","Type":"ContainerStarted","Data":"2c078624e6bc9ac9c91380a57f02b00473f090d515d8457d78e406ab9b1194b6"} Apr 22 20:14:25.538711 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:25.538719 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" event={"ID":"67a17741-1a2e-4f5f-b054-315dbafc71c2","Type":"ContainerStarted","Data":"0b0b369ed7188112b3f542a642b999a954ca25dc15b5339c6f7b77bc71cb255b"} Apr 22 20:14:29.553754 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:29.553719 2583 generic.go:358] "Generic (PLEG): container finished" podID="67a17741-1a2e-4f5f-b054-315dbafc71c2" containerID="2c078624e6bc9ac9c91380a57f02b00473f090d515d8457d78e406ab9b1194b6" exitCode=0 Apr 22 20:14:29.554247 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:29.553801 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" event={"ID":"67a17741-1a2e-4f5f-b054-315dbafc71c2","Type":"ContainerDied","Data":"2c078624e6bc9ac9c91380a57f02b00473f090d515d8457d78e406ab9b1194b6"} Apr 22 20:14:29.907358 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:29.907334 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" Apr 22 20:14:30.038223 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:30.037967 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf3140b2-da98-4446-821a-68c6956fa1d9-kserve-provision-location\") pod \"bf3140b2-da98-4446-821a-68c6956fa1d9\" (UID: \"bf3140b2-da98-4446-821a-68c6956fa1d9\") " Apr 22 20:14:30.038580 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:30.038553 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3140b2-da98-4446-821a-68c6956fa1d9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bf3140b2-da98-4446-821a-68c6956fa1d9" (UID: "bf3140b2-da98-4446-821a-68c6956fa1d9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:14:30.141039 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:30.141001 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf3140b2-da98-4446-821a-68c6956fa1d9-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:14:30.561016 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:30.560977 2583 generic.go:358] "Generic (PLEG): container finished" podID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerID="7386b67547d9c3211b884888793473b2ba06b22e380b1e018669b1a7b086224e" exitCode=0 Apr 22 20:14:30.561510 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:30.561144 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" event={"ID":"bf3140b2-da98-4446-821a-68c6956fa1d9","Type":"ContainerDied","Data":"7386b67547d9c3211b884888793473b2ba06b22e380b1e018669b1a7b086224e"} Apr 22 20:14:30.561510 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:30.561172 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" event={"ID":"bf3140b2-da98-4446-821a-68c6956fa1d9","Type":"ContainerDied","Data":"316819e07590986fe98f8ec74e168754e452a6efee5c4467080f73bb3a520e24"} Apr 22 20:14:30.561510 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:30.561194 2583 scope.go:117] "RemoveContainer" containerID="7386b67547d9c3211b884888793473b2ba06b22e380b1e018669b1a7b086224e" Apr 22 20:14:30.561510 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:30.561346 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq" Apr 22 20:14:30.578522 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:30.578462 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq"] Apr 22 20:14:30.582146 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:30.582121 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-lpcwq"] Apr 22 20:14:30.593086 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:30.592973 2583 scope.go:117] "RemoveContainer" containerID="ad7f1a335b9a549a85fa381485fe22d8fcc36af0d2e20085e663f802a229493c" Apr 22 20:14:30.609805 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:30.609783 2583 scope.go:117] "RemoveContainer" containerID="7386b67547d9c3211b884888793473b2ba06b22e380b1e018669b1a7b086224e" Apr 22 20:14:30.611058 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:14:30.611029 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7386b67547d9c3211b884888793473b2ba06b22e380b1e018669b1a7b086224e\": container with ID starting with 7386b67547d9c3211b884888793473b2ba06b22e380b1e018669b1a7b086224e not found: ID does not exist" containerID="7386b67547d9c3211b884888793473b2ba06b22e380b1e018669b1a7b086224e" Apr 22 20:14:30.611230 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:30.611081 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7386b67547d9c3211b884888793473b2ba06b22e380b1e018669b1a7b086224e"} err="failed to get container status \"7386b67547d9c3211b884888793473b2ba06b22e380b1e018669b1a7b086224e\": rpc error: code = NotFound desc = could not find container 
\"7386b67547d9c3211b884888793473b2ba06b22e380b1e018669b1a7b086224e\": container with ID starting with 7386b67547d9c3211b884888793473b2ba06b22e380b1e018669b1a7b086224e not found: ID does not exist" Apr 22 20:14:30.611230 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:30.611110 2583 scope.go:117] "RemoveContainer" containerID="ad7f1a335b9a549a85fa381485fe22d8fcc36af0d2e20085e663f802a229493c" Apr 22 20:14:30.611584 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:14:30.611503 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad7f1a335b9a549a85fa381485fe22d8fcc36af0d2e20085e663f802a229493c\": container with ID starting with ad7f1a335b9a549a85fa381485fe22d8fcc36af0d2e20085e663f802a229493c not found: ID does not exist" containerID="ad7f1a335b9a549a85fa381485fe22d8fcc36af0d2e20085e663f802a229493c" Apr 22 20:14:30.611584 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:30.611536 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad7f1a335b9a549a85fa381485fe22d8fcc36af0d2e20085e663f802a229493c"} err="failed to get container status \"ad7f1a335b9a549a85fa381485fe22d8fcc36af0d2e20085e663f802a229493c\": rpc error: code = NotFound desc = could not find container \"ad7f1a335b9a549a85fa381485fe22d8fcc36af0d2e20085e663f802a229493c\": container with ID starting with ad7f1a335b9a549a85fa381485fe22d8fcc36af0d2e20085e663f802a229493c not found: ID does not exist" Apr 22 20:14:32.106489 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:14:32.106454 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" path="/var/lib/kubelet/pods/bf3140b2-da98-4446-821a-68c6956fa1d9/volumes" Apr 22 20:16:48.101057 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:16:48.101020 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" event={"ID":"67a17741-1a2e-4f5f-b054-315dbafc71c2","Type":"ContainerStarted","Data":"deb6babe25a42b2a9f2d607e4537cc49897de1f7d009f47ff42a36330ac89639"} Apr 22 20:16:48.101557 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:16:48.101137 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" Apr 22 20:16:48.126631 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:16:48.126583 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" podStartSLOduration=6.462856958 podStartE2EDuration="2m24.126568074s" podCreationTimestamp="2026-04-22 20:14:24 +0000 UTC" firstStartedPulling="2026-04-22 20:14:29.554920051 +0000 UTC m=+957.974826820" lastFinishedPulling="2026-04-22 20:16:47.218631159 +0000 UTC m=+1095.638537936" observedRunningTime="2026-04-22 20:16:48.123717911 +0000 UTC m=+1096.543624696" watchObservedRunningTime="2026-04-22 20:16:48.126568074 +0000 UTC m=+1096.546474865" Apr 22 20:17:19.111855 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:19.111824 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" Apr 22 20:17:24.770574 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:24.770528 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd"] Apr 22 20:17:24.771021 ip-10-0-135-221 kubenswrapper[2583]: I0422 
20:17:24.770791 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" podUID="67a17741-1a2e-4f5f-b054-315dbafc71c2" containerName="kserve-container" containerID="cri-o://deb6babe25a42b2a9f2d607e4537cc49897de1f7d009f47ff42a36330ac89639" gracePeriod=30 Apr 22 20:17:24.857911 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:24.857854 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd"] Apr 22 20:17:24.858452 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:24.858428 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerName="kserve-container" Apr 22 20:17:24.858571 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:24.858454 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerName="kserve-container" Apr 22 20:17:24.858571 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:24.858493 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerName="storage-initializer" Apr 22 20:17:24.858571 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:24.858503 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerName="storage-initializer" Apr 22 20:17:24.858727 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:24.858621 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf3140b2-da98-4446-821a-68c6956fa1d9" containerName="kserve-container" Apr 22 20:17:24.861133 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:24.861106 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" Apr 22 20:17:24.868168 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:24.868142 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd"] Apr 22 20:17:24.931781 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:24.931740 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9f037d4-e36d-4cbe-95ed-9eebd399e933-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd\" (UID: \"d9f037d4-e36d-4cbe-95ed-9eebd399e933\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" Apr 22 20:17:25.032367 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:25.032257 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9f037d4-e36d-4cbe-95ed-9eebd399e933-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd\" (UID: \"d9f037d4-e36d-4cbe-95ed-9eebd399e933\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" Apr 22 20:17:25.032672 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:25.032649 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9f037d4-e36d-4cbe-95ed-9eebd399e933-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd\" (UID: \"d9f037d4-e36d-4cbe-95ed-9eebd399e933\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" Apr 22 20:17:25.174701 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:25.174649 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" Apr 22 20:17:25.364192 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:25.364143 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd"] Apr 22 20:17:25.367558 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:17:25.367523 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9f037d4_e36d_4cbe_95ed_9eebd399e933.slice/crio-ea3c9395fa11052d491458a9ea4791d391552709021bb692b063b8fc88b1978d WatchSource:0}: Error finding container ea3c9395fa11052d491458a9ea4791d391552709021bb692b063b8fc88b1978d: Status 404 returned error can't find the container with id ea3c9395fa11052d491458a9ea4791d391552709021bb692b063b8fc88b1978d Apr 22 20:17:25.369704 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:25.369684 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:17:25.905000 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:25.904975 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" Apr 22 20:17:26.040208 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.040126 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67a17741-1a2e-4f5f-b054-315dbafc71c2-kserve-provision-location\") pod \"67a17741-1a2e-4f5f-b054-315dbafc71c2\" (UID: \"67a17741-1a2e-4f5f-b054-315dbafc71c2\") " Apr 22 20:17:26.040459 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.040435 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67a17741-1a2e-4f5f-b054-315dbafc71c2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "67a17741-1a2e-4f5f-b054-315dbafc71c2" (UID: "67a17741-1a2e-4f5f-b054-315dbafc71c2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:17:26.141124 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.141089 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67a17741-1a2e-4f5f-b054-315dbafc71c2-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:17:26.238719 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.238680 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" event={"ID":"d9f037d4-e36d-4cbe-95ed-9eebd399e933","Type":"ContainerStarted","Data":"e9944e552958c48b29cf73b88de8f49b5f44d3c273747a74f9c864e0471d69b1"} Apr 22 20:17:26.238719 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.238721 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" event={"ID":"d9f037d4-e36d-4cbe-95ed-9eebd399e933","Type":"ContainerStarted","Data":"ea3c9395fa11052d491458a9ea4791d391552709021bb692b063b8fc88b1978d"} Apr 22 20:17:26.240199 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.240172 2583 generic.go:358] "Generic (PLEG): container finished" podID="67a17741-1a2e-4f5f-b054-315dbafc71c2" containerID="deb6babe25a42b2a9f2d607e4537cc49897de1f7d009f47ff42a36330ac89639" exitCode=0 Apr 22 20:17:26.240315 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.240234 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" event={"ID":"67a17741-1a2e-4f5f-b054-315dbafc71c2","Type":"ContainerDied","Data":"deb6babe25a42b2a9f2d607e4537cc49897de1f7d009f47ff42a36330ac89639"} Apr 22 20:17:26.240315 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.240255 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" event={"ID":"67a17741-1a2e-4f5f-b054-315dbafc71c2","Type":"ContainerDied","Data":"0b0b369ed7188112b3f542a642b999a954ca25dc15b5339c6f7b77bc71cb255b"} Apr 22 20:17:26.240315 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.240273 2583 scope.go:117] "RemoveContainer" containerID="deb6babe25a42b2a9f2d607e4537cc49897de1f7d009f47ff42a36330ac89639" Apr 22 20:17:26.240315 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.240236 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd" Apr 22 20:17:26.248496 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.248480 2583 scope.go:117] "RemoveContainer" containerID="2c078624e6bc9ac9c91380a57f02b00473f090d515d8457d78e406ab9b1194b6" Apr 22 20:17:26.257239 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.257222 2583 scope.go:117] "RemoveContainer" containerID="deb6babe25a42b2a9f2d607e4537cc49897de1f7d009f47ff42a36330ac89639" Apr 22 20:17:26.257498 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:17:26.257478 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb6babe25a42b2a9f2d607e4537cc49897de1f7d009f47ff42a36330ac89639\": container with ID starting with deb6babe25a42b2a9f2d607e4537cc49897de1f7d009f47ff42a36330ac89639 not found: ID does not exist" containerID="deb6babe25a42b2a9f2d607e4537cc49897de1f7d009f47ff42a36330ac89639" Apr 22 20:17:26.257542 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.257509 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb6babe25a42b2a9f2d607e4537cc49897de1f7d009f47ff42a36330ac89639"} err="failed to get container status \"deb6babe25a42b2a9f2d607e4537cc49897de1f7d009f47ff42a36330ac89639\": rpc error: code = NotFound desc = could not find container \"deb6babe25a42b2a9f2d607e4537cc49897de1f7d009f47ff42a36330ac89639\": container with ID starting with deb6babe25a42b2a9f2d607e4537cc49897de1f7d009f47ff42a36330ac89639 not found: ID does not exist" Apr 22 20:17:26.257542 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.257529 2583 scope.go:117] "RemoveContainer" containerID="2c078624e6bc9ac9c91380a57f02b00473f090d515d8457d78e406ab9b1194b6" Apr 22 20:17:26.257775 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:17:26.257751 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c078624e6bc9ac9c91380a57f02b00473f090d515d8457d78e406ab9b1194b6\": container with ID starting with 2c078624e6bc9ac9c91380a57f02b00473f090d515d8457d78e406ab9b1194b6 not found: ID does not exist" containerID="2c078624e6bc9ac9c91380a57f02b00473f090d515d8457d78e406ab9b1194b6" Apr 22 20:17:26.257895 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.257787 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c078624e6bc9ac9c91380a57f02b00473f090d515d8457d78e406ab9b1194b6"} err="failed to get container status \"2c078624e6bc9ac9c91380a57f02b00473f090d515d8457d78e406ab9b1194b6\": rpc error: code = NotFound desc = could not find container \"2c078624e6bc9ac9c91380a57f02b00473f090d515d8457d78e406ab9b1194b6\": container with ID starting with 2c078624e6bc9ac9c91380a57f02b00473f090d515d8457d78e406ab9b1194b6 not found: ID does not exist" Apr 22 20:17:26.264241 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.264216 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd"] Apr 22 20:17:26.270173 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:26.270151 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-8skmd"] Apr 22 20:17:28.096369 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:28.096334 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67a17741-1a2e-4f5f-b054-315dbafc71c2" 
path="/var/lib/kubelet/pods/67a17741-1a2e-4f5f-b054-315dbafc71c2/volumes" Apr 22 20:17:29.258257 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:29.258203 2583 generic.go:358] "Generic (PLEG): container finished" podID="d9f037d4-e36d-4cbe-95ed-9eebd399e933" containerID="e9944e552958c48b29cf73b88de8f49b5f44d3c273747a74f9c864e0471d69b1" exitCode=0 Apr 22 20:17:29.258614 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:29.258304 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" event={"ID":"d9f037d4-e36d-4cbe-95ed-9eebd399e933","Type":"ContainerDied","Data":"e9944e552958c48b29cf73b88de8f49b5f44d3c273747a74f9c864e0471d69b1"} Apr 22 20:17:30.263408 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:30.263371 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" event={"ID":"d9f037d4-e36d-4cbe-95ed-9eebd399e933","Type":"ContainerStarted","Data":"54912f88f8c376f78a0bf4107356a69bbcd63dcc8cf41f03e9b623f658e2ae79"} Apr 22 20:17:30.263899 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:30.263684 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" Apr 22 20:17:30.265242 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:30.265211 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" podUID="d9f037d4-e36d-4cbe-95ed-9eebd399e933" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 22 20:17:30.278361 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:30.278310 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" podStartSLOduration=6.278296396 podStartE2EDuration="6.278296396s" podCreationTimestamp="2026-04-22 20:17:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:17:30.277892998 +0000 UTC m=+1138.697799791" watchObservedRunningTime="2026-04-22 20:17:30.278296396 +0000 UTC m=+1138.698203190" Apr 22 20:17:31.268572 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:31.268535 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" podUID="d9f037d4-e36d-4cbe-95ed-9eebd399e933" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 22 20:17:32.110516 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:32.110481 2583 scope.go:117] "RemoveContainer" containerID="e244988b72c772825256f16a5b69aa23a372495a6c91da8a5fd64f5ec34999c7" Apr 22 20:17:32.118500 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:32.118478 2583 scope.go:117] "RemoveContainer" containerID="3bb1ca22691f6469df203f9027000a4e8517dbceeb1baa050c24348f803d14f8" Apr 22 20:17:41.269970 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:41.269932 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" Apr 22 20:17:44.972085 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:44.972050 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf"] Apr 22 20:17:44.972633 
ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:44.972618 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67a17741-1a2e-4f5f-b054-315dbafc71c2" containerName="storage-initializer" Apr 22 20:17:44.972684 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:44.972636 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a17741-1a2e-4f5f-b054-315dbafc71c2" containerName="storage-initializer" Apr 22 20:17:44.972684 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:44.972673 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67a17741-1a2e-4f5f-b054-315dbafc71c2" containerName="kserve-container" Apr 22 20:17:44.972684 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:44.972682 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a17741-1a2e-4f5f-b054-315dbafc71c2" containerName="kserve-container" Apr 22 20:17:44.972785 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:44.972767 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="67a17741-1a2e-4f5f-b054-315dbafc71c2" containerName="kserve-container" Apr 22 20:17:44.977593 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:44.977574 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" Apr 22 20:17:44.983287 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:44.983252 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf"] Apr 22 20:17:44.995944 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:44.995911 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd"] Apr 22 20:17:44.996332 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:44.996260 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" podUID="d9f037d4-e36d-4cbe-95ed-9eebd399e933" containerName="kserve-container" containerID="cri-o://54912f88f8c376f78a0bf4107356a69bbcd63dcc8cf41f03e9b623f658e2ae79" gracePeriod=30 Apr 22 20:17:45.010161 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:45.010131 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1ac14c4-633d-4f6b-907e-e24206128738-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf\" (UID: \"f1ac14c4-633d-4f6b-907e-e24206128738\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" Apr 22 20:17:45.110735 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:45.110693 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1ac14c4-633d-4f6b-907e-e24206128738-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf\" (UID: \"f1ac14c4-633d-4f6b-907e-e24206128738\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" Apr 22 20:17:45.111065 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:45.111039 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1ac14c4-633d-4f6b-907e-e24206128738-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf\" (UID: 
\"f1ac14c4-633d-4f6b-907e-e24206128738\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" Apr 22 20:17:45.289832 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:45.289720 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" Apr 22 20:17:45.446351 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:45.446325 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf"] Apr 22 20:17:45.448629 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:17:45.448598 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1ac14c4_633d_4f6b_907e_e24206128738.slice/crio-7b0fde2393af24f8a96d6c840a50542f9afccc4a953364e5f198898399b0a081 WatchSource:0}: Error finding container 7b0fde2393af24f8a96d6c840a50542f9afccc4a953364e5f198898399b0a081: Status 404 returned error can't find the container with id 7b0fde2393af24f8a96d6c840a50542f9afccc4a953364e5f198898399b0a081 Apr 22 20:17:45.673737 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:45.673714 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" Apr 22 20:17:45.715531 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:45.715499 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9f037d4-e36d-4cbe-95ed-9eebd399e933-kserve-provision-location\") pod \"d9f037d4-e36d-4cbe-95ed-9eebd399e933\" (UID: \"d9f037d4-e36d-4cbe-95ed-9eebd399e933\") " Apr 22 20:17:45.715844 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:45.715816 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9f037d4-e36d-4cbe-95ed-9eebd399e933-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d9f037d4-e36d-4cbe-95ed-9eebd399e933" (UID: "d9f037d4-e36d-4cbe-95ed-9eebd399e933"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:17:45.816189 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:45.816159 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9f037d4-e36d-4cbe-95ed-9eebd399e933-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:17:46.333654 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:46.333594 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" event={"ID":"f1ac14c4-633d-4f6b-907e-e24206128738","Type":"ContainerStarted","Data":"3945255900c9714c252a4d25ca56c7a16e775d7b47e812a2b44d9a0519d1abec"} Apr 22 20:17:46.334145 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:46.333703 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" event={"ID":"f1ac14c4-633d-4f6b-907e-e24206128738","Type":"ContainerStarted","Data":"7b0fde2393af24f8a96d6c840a50542f9afccc4a953364e5f198898399b0a081"} Apr 22 20:17:46.335307 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:46.335277 2583 generic.go:358] "Generic (PLEG): container finished" podID="d9f037d4-e36d-4cbe-95ed-9eebd399e933" containerID="54912f88f8c376f78a0bf4107356a69bbcd63dcc8cf41f03e9b623f658e2ae79" exitCode=0 Apr 22 20:17:46.335428 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:46.335354 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" event={"ID":"d9f037d4-e36d-4cbe-95ed-9eebd399e933","Type":"ContainerDied","Data":"54912f88f8c376f78a0bf4107356a69bbcd63dcc8cf41f03e9b623f658e2ae79"} Apr 22 20:17:46.335428 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:46.335384 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" event={"ID":"d9f037d4-e36d-4cbe-95ed-9eebd399e933","Type":"ContainerDied","Data":"ea3c9395fa11052d491458a9ea4791d391552709021bb692b063b8fc88b1978d"} Apr 22 20:17:46.335428 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:46.335387 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd" Apr 22 20:17:46.335569 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:46.335399 2583 scope.go:117] "RemoveContainer" containerID="54912f88f8c376f78a0bf4107356a69bbcd63dcc8cf41f03e9b623f658e2ae79" Apr 22 20:17:46.344108 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:46.344080 2583 scope.go:117] "RemoveContainer" containerID="e9944e552958c48b29cf73b88de8f49b5f44d3c273747a74f9c864e0471d69b1" Apr 22 20:17:46.352345 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:46.352327 2583 scope.go:117] "RemoveContainer" containerID="54912f88f8c376f78a0bf4107356a69bbcd63dcc8cf41f03e9b623f658e2ae79" Apr 22 20:17:46.352624 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:17:46.352601 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54912f88f8c376f78a0bf4107356a69bbcd63dcc8cf41f03e9b623f658e2ae79\": container with ID starting with 54912f88f8c376f78a0bf4107356a69bbcd63dcc8cf41f03e9b623f658e2ae79 not found: ID does not exist" containerID="54912f88f8c376f78a0bf4107356a69bbcd63dcc8cf41f03e9b623f658e2ae79" Apr 22 20:17:46.352711 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:46.352632 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54912f88f8c376f78a0bf4107356a69bbcd63dcc8cf41f03e9b623f658e2ae79"} err="failed to get container status \"54912f88f8c376f78a0bf4107356a69bbcd63dcc8cf41f03e9b623f658e2ae79\": rpc error: code = NotFound desc = could not find container \"54912f88f8c376f78a0bf4107356a69bbcd63dcc8cf41f03e9b623f658e2ae79\": container with ID starting with 54912f88f8c376f78a0bf4107356a69bbcd63dcc8cf41f03e9b623f658e2ae79 not found: ID does not exist" Apr 22 20:17:46.352711 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:46.352651 2583 scope.go:117] "RemoveContainer" containerID="e9944e552958c48b29cf73b88de8f49b5f44d3c273747a74f9c864e0471d69b1" Apr 22 20:17:46.352973 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:17:46.352937 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9944e552958c48b29cf73b88de8f49b5f44d3c273747a74f9c864e0471d69b1\": container with ID starting with e9944e552958c48b29cf73b88de8f49b5f44d3c273747a74f9c864e0471d69b1 not found: ID does not exist" containerID="e9944e552958c48b29cf73b88de8f49b5f44d3c273747a74f9c864e0471d69b1" Apr 22 20:17:46.353045 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:46.352977 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9944e552958c48b29cf73b88de8f49b5f44d3c273747a74f9c864e0471d69b1"} err="failed to get container status \"e9944e552958c48b29cf73b88de8f49b5f44d3c273747a74f9c864e0471d69b1\": rpc error: code = NotFound desc = could not find container \"e9944e552958c48b29cf73b88de8f49b5f44d3c273747a74f9c864e0471d69b1\": container with ID starting with e9944e552958c48b29cf73b88de8f49b5f44d3c273747a74f9c864e0471d69b1 not found: ID does not exist" Apr 22 20:17:46.362576 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:46.362522 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd"] Apr 22 20:17:46.366434 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:46.366413 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-zfwpd"] Apr 22 20:17:48.096324 ip-10-0-135-221 
kubenswrapper[2583]: I0422 20:17:48.096293 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9f037d4-e36d-4cbe-95ed-9eebd399e933" path="/var/lib/kubelet/pods/d9f037d4-e36d-4cbe-95ed-9eebd399e933/volumes" Apr 22 20:17:50.351819 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:50.351786 2583 generic.go:358] "Generic (PLEG): container finished" podID="f1ac14c4-633d-4f6b-907e-e24206128738" containerID="3945255900c9714c252a4d25ca56c7a16e775d7b47e812a2b44d9a0519d1abec" exitCode=0 Apr 22 20:17:50.351819 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:50.351823 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" event={"ID":"f1ac14c4-633d-4f6b-907e-e24206128738","Type":"ContainerDied","Data":"3945255900c9714c252a4d25ca56c7a16e775d7b47e812a2b44d9a0519d1abec"} Apr 22 20:17:51.357136 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:51.357049 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" event={"ID":"f1ac14c4-633d-4f6b-907e-e24206128738","Type":"ContainerStarted","Data":"b0479caece4db67e662a0e9ec5399ac0412c4f06ee4b12202e9df86e28761c71"} Apr 22 20:17:51.357613 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:51.357291 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" Apr 22 20:17:51.374916 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:17:51.374845 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" podStartSLOduration=7.374831094 podStartE2EDuration="7.374831094s" podCreationTimestamp="2026-04-22 20:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:17:51.372403676 +0000 UTC m=+1159.792310464" watchObservedRunningTime="2026-04-22 20:17:51.374831094 +0000 UTC m=+1159.794737884" Apr 22 20:18:22.370503 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:22.370473 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" Apr 22 20:18:25.109511 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:25.109477 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf"] Apr 22 20:18:25.109966 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:25.109734 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" podUID="f1ac14c4-633d-4f6b-907e-e24206128738" containerName="kserve-container" containerID="cri-o://b0479caece4db67e662a0e9ec5399ac0412c4f06ee4b12202e9df86e28761c71" gracePeriod=30 Apr 22 20:18:25.168723 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:25.168686 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq"] Apr 22 20:18:25.169087 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:25.169074 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9f037d4-e36d-4cbe-95ed-9eebd399e933" containerName="kserve-container" Apr 22 20:18:25.169142 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:25.169088 2583 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d9f037d4-e36d-4cbe-95ed-9eebd399e933" containerName="kserve-container" Apr 22 20:18:25.169142 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:25.169109 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9f037d4-e36d-4cbe-95ed-9eebd399e933" containerName="storage-initializer" Apr 22 20:18:25.169142 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:25.169115 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f037d4-e36d-4cbe-95ed-9eebd399e933" containerName="storage-initializer" Apr 22 20:18:25.169240 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:25.169172 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9f037d4-e36d-4cbe-95ed-9eebd399e933" containerName="kserve-container" Apr 22 20:18:25.172726 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:25.172702 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" Apr 22 20:18:25.182392 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:25.182365 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq"] Apr 22 20:18:25.236468 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:25.236432 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23e78406-ea50-4890-ae63-5e1955091bd7-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-676648fc54-vt8mq\" (UID: \"23e78406-ea50-4890-ae63-5e1955091bd7\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" Apr 22 20:18:25.337922 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:25.337886 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23e78406-ea50-4890-ae63-5e1955091bd7-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-676648fc54-vt8mq\" (UID: \"23e78406-ea50-4890-ae63-5e1955091bd7\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" Apr 22 20:18:25.338235 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:25.338218 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23e78406-ea50-4890-ae63-5e1955091bd7-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-676648fc54-vt8mq\" (UID: \"23e78406-ea50-4890-ae63-5e1955091bd7\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" Apr 22 20:18:25.484553 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:25.484516 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" Apr 22 20:18:25.614681 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:25.614630 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq"] Apr 22 20:18:25.618681 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:18:25.618541 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23e78406_ea50_4890_ae63_5e1955091bd7.slice/crio-107c407a436a7bb79ec712ef174998eee78468aafda3cced6fddb3c5abf10519 WatchSource:0}: Error finding container 107c407a436a7bb79ec712ef174998eee78468aafda3cced6fddb3c5abf10519: Status 404 returned error can't find the container with id 107c407a436a7bb79ec712ef174998eee78468aafda3cced6fddb3c5abf10519 Apr 22 20:18:26.488071 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:26.488030 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" event={"ID":"23e78406-ea50-4890-ae63-5e1955091bd7","Type":"ContainerStarted","Data":"1a9c294b42f679326bb1609872891880ff0bbb0a3a575000affd444a26f1f010"} Apr 22 20:18:26.488071 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:26.488073 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" event={"ID":"23e78406-ea50-4890-ae63-5e1955091bd7","Type":"ContainerStarted","Data":"107c407a436a7bb79ec712ef174998eee78468aafda3cced6fddb3c5abf10519"} Apr 22 20:18:29.666774 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:29.666747 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" Apr 22 20:18:29.784164 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:29.784073 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1ac14c4-633d-4f6b-907e-e24206128738-kserve-provision-location\") pod \"f1ac14c4-633d-4f6b-907e-e24206128738\" (UID: \"f1ac14c4-633d-4f6b-907e-e24206128738\") " Apr 22 20:18:29.784432 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:29.784405 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ac14c4-633d-4f6b-907e-e24206128738-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f1ac14c4-633d-4f6b-907e-e24206128738" (UID: "f1ac14c4-633d-4f6b-907e-e24206128738"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:18:29.885607 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:29.885558 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1ac14c4-633d-4f6b-907e-e24206128738-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:18:30.505163 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:30.505128 2583 generic.go:358] "Generic (PLEG): container finished" podID="23e78406-ea50-4890-ae63-5e1955091bd7" containerID="1a9c294b42f679326bb1609872891880ff0bbb0a3a575000affd444a26f1f010" exitCode=0 Apr 22 20:18:30.505333 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:30.505206 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" event={"ID":"23e78406-ea50-4890-ae63-5e1955091bd7","Type":"ContainerDied","Data":"1a9c294b42f679326bb1609872891880ff0bbb0a3a575000affd444a26f1f010"} Apr 22 20:18:30.506792 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:30.506769 2583 generic.go:358] "Generic (PLEG): container finished" podID="f1ac14c4-633d-4f6b-907e-e24206128738" containerID="b0479caece4db67e662a0e9ec5399ac0412c4f06ee4b12202e9df86e28761c71" exitCode=0 Apr 22 20:18:30.506902 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:30.506830 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" Apr 22 20:18:30.506902 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:30.506837 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" event={"ID":"f1ac14c4-633d-4f6b-907e-e24206128738","Type":"ContainerDied","Data":"b0479caece4db67e662a0e9ec5399ac0412c4f06ee4b12202e9df86e28761c71"} Apr 22 20:18:30.506902 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:30.506877 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf" event={"ID":"f1ac14c4-633d-4f6b-907e-e24206128738","Type":"ContainerDied","Data":"7b0fde2393af24f8a96d6c840a50542f9afccc4a953364e5f198898399b0a081"} Apr 22 20:18:30.506902 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:30.506896 2583 scope.go:117] "RemoveContainer" containerID="b0479caece4db67e662a0e9ec5399ac0412c4f06ee4b12202e9df86e28761c71" Apr 22 20:18:30.515201 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:30.515184 2583 scope.go:117] "RemoveContainer" containerID="3945255900c9714c252a4d25ca56c7a16e775d7b47e812a2b44d9a0519d1abec" Apr 22 20:18:30.525360 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:30.525335 2583 scope.go:117] "RemoveContainer" containerID="b0479caece4db67e662a0e9ec5399ac0412c4f06ee4b12202e9df86e28761c71" Apr 22 20:18:30.525618 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:18:30.525599 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0479caece4db67e662a0e9ec5399ac0412c4f06ee4b12202e9df86e28761c71\": container with ID starting with b0479caece4db67e662a0e9ec5399ac0412c4f06ee4b12202e9df86e28761c71 not found: ID does not exist" containerID="b0479caece4db67e662a0e9ec5399ac0412c4f06ee4b12202e9df86e28761c71" Apr 22 20:18:30.525672 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:30.525628 2583 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b0479caece4db67e662a0e9ec5399ac0412c4f06ee4b12202e9df86e28761c71"} err="failed to get container status \"b0479caece4db67e662a0e9ec5399ac0412c4f06ee4b12202e9df86e28761c71\": rpc error: code = NotFound desc = could not find container \"b0479caece4db67e662a0e9ec5399ac0412c4f06ee4b12202e9df86e28761c71\": container with ID starting with b0479caece4db67e662a0e9ec5399ac0412c4f06ee4b12202e9df86e28761c71 not found: ID does not exist" Apr 22 20:18:30.525672 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:30.525645 2583 scope.go:117] "RemoveContainer" containerID="3945255900c9714c252a4d25ca56c7a16e775d7b47e812a2b44d9a0519d1abec" Apr 22 20:18:30.525903 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:18:30.525885 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3945255900c9714c252a4d25ca56c7a16e775d7b47e812a2b44d9a0519d1abec\": container with ID starting with 3945255900c9714c252a4d25ca56c7a16e775d7b47e812a2b44d9a0519d1abec not found: ID does not exist" containerID="3945255900c9714c252a4d25ca56c7a16e775d7b47e812a2b44d9a0519d1abec" Apr 22 20:18:30.525952 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:30.525910 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3945255900c9714c252a4d25ca56c7a16e775d7b47e812a2b44d9a0519d1abec"} err="failed to get container status \"3945255900c9714c252a4d25ca56c7a16e775d7b47e812a2b44d9a0519d1abec\": rpc error: code = NotFound desc = could not find container \"3945255900c9714c252a4d25ca56c7a16e775d7b47e812a2b44d9a0519d1abec\": container with ID starting with 3945255900c9714c252a4d25ca56c7a16e775d7b47e812a2b44d9a0519d1abec not found: ID does not exist" Apr 22 20:18:30.543675 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:30.543644 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf"] Apr 22 20:18:30.546985 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:30.546962 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-g2mxf"] Apr 22 20:18:31.513279 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:31.513245 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" event={"ID":"23e78406-ea50-4890-ae63-5e1955091bd7","Type":"ContainerStarted","Data":"49c0c46ee22bf5e0a75b250e47cd2ac7d8cfc8fe899464b6deae8b672da22a45"} Apr 22 20:18:32.099340 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:32.099165 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1ac14c4-633d-4f6b-907e-e24206128738" path="/var/lib/kubelet/pods/f1ac14c4-633d-4f6b-907e-e24206128738/volumes" Apr 22 20:18:33.524598 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:33.524555 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" event={"ID":"23e78406-ea50-4890-ae63-5e1955091bd7","Type":"ContainerStarted","Data":"91e603df020d3a8c46790dd7f7593e28cb392a54e4e18926ec4da176b737cb42"} Apr 22 20:18:33.525026 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:33.524790 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" Apr 22 20:18:33.525026 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:33.524820 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" Apr 22 20:18:33.541928 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:18:33.541856 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" podStartSLOduration=5.758687248 podStartE2EDuration="8.541842296s" podCreationTimestamp="2026-04-22 20:18:25 +0000 UTC" firstStartedPulling="2026-04-22 20:18:30.579629655 +0000 UTC m=+1198.999536440" lastFinishedPulling="2026-04-22 20:18:33.362784717 +0000 UTC m=+1201.782691488" observedRunningTime="2026-04-22 20:18:33.539515829 +0000 UTC m=+1201.959422645" watchObservedRunningTime="2026-04-22 20:18:33.541842296 +0000 UTC m=+1201.961749086" Apr 22 20:19:04.531186 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:04.531148 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" Apr 22 20:19:34.532028 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:34.531992 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" Apr 22 20:19:35.198317 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.198275 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq"] Apr 22 20:19:35.198698 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.198641 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" podUID="23e78406-ea50-4890-ae63-5e1955091bd7" containerName="kserve-container" containerID="cri-o://49c0c46ee22bf5e0a75b250e47cd2ac7d8cfc8fe899464b6deae8b672da22a45" gracePeriod=30 Apr 22 20:19:35.198698 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.198676 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" podUID="23e78406-ea50-4890-ae63-5e1955091bd7" containerName="kserve-agent" containerID="cri-o://91e603df020d3a8c46790dd7f7593e28cb392a54e4e18926ec4da176b737cb42" gracePeriod=30 Apr 22 20:19:35.234614 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.234584 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl"] Apr 22 20:19:35.235007 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.234994 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1ac14c4-633d-4f6b-907e-e24206128738" containerName="storage-initializer" Apr 22 20:19:35.235056 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.235009 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ac14c4-633d-4f6b-907e-e24206128738" containerName="storage-initializer" Apr 22 20:19:35.235056 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.235027 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1ac14c4-633d-4f6b-907e-e24206128738" containerName="kserve-container" Apr 22 20:19:35.235056 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.235033 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ac14c4-633d-4f6b-907e-e24206128738" containerName="kserve-container" Apr 22 20:19:35.235152 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.235095 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1ac14c4-633d-4f6b-907e-e24206128738" containerName="kserve-container" Apr 22 
20:19:35.238675 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.238644 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" Apr 22 20:19:35.247729 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.247702 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl"] Apr 22 20:19:35.340335 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.340305 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41e60254-1707-4dc6-a262-7f65b32e8322-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-ghxdl\" (UID: \"41e60254-1707-4dc6-a262-7f65b32e8322\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" Apr 22 20:19:35.441237 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.441194 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41e60254-1707-4dc6-a262-7f65b32e8322-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-ghxdl\" (UID: \"41e60254-1707-4dc6-a262-7f65b32e8322\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" Apr 22 20:19:35.441577 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.441554 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41e60254-1707-4dc6-a262-7f65b32e8322-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-ghxdl\" (UID: \"41e60254-1707-4dc6-a262-7f65b32e8322\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" Apr 22 20:19:35.549141 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.549054 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" Apr 22 20:19:35.701223 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.701200 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl"] Apr 22 20:19:35.703821 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:19:35.703789 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41e60254_1707_4dc6_a262_7f65b32e8322.slice/crio-49eb62691e50a1cdaf2d5d659c6ed41d179c89e63a3b0edfbde27f0a4c4bbbf8 WatchSource:0}: Error finding container 49eb62691e50a1cdaf2d5d659c6ed41d179c89e63a3b0edfbde27f0a4c4bbbf8: Status 404 returned error can't find the container with id 49eb62691e50a1cdaf2d5d659c6ed41d179c89e63a3b0edfbde27f0a4c4bbbf8 Apr 22 20:19:35.738530 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:35.738505 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" event={"ID":"41e60254-1707-4dc6-a262-7f65b32e8322","Type":"ContainerStarted","Data":"49eb62691e50a1cdaf2d5d659c6ed41d179c89e63a3b0edfbde27f0a4c4bbbf8"} Apr 22 20:19:36.743357 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:36.743321 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" event={"ID":"41e60254-1707-4dc6-a262-7f65b32e8322","Type":"ContainerStarted","Data":"292382cbb237bac3b31e8470a586338094ff222cdc4c3afe7527fba928a38e00"} Apr 22 20:19:37.748517 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:37.748486 2583 generic.go:358] "Generic (PLEG): container finished" podID="23e78406-ea50-4890-ae63-5e1955091bd7" containerID="49c0c46ee22bf5e0a75b250e47cd2ac7d8cfc8fe899464b6deae8b672da22a45" exitCode=0 Apr 22 20:19:37.748947 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:37.748575 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" event={"ID":"23e78406-ea50-4890-ae63-5e1955091bd7","Type":"ContainerDied","Data":"49c0c46ee22bf5e0a75b250e47cd2ac7d8cfc8fe899464b6deae8b672da22a45"} Apr 22 20:19:40.761131 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:40.761092 2583 generic.go:358] "Generic (PLEG): container finished" podID="41e60254-1707-4dc6-a262-7f65b32e8322" containerID="292382cbb237bac3b31e8470a586338094ff222cdc4c3afe7527fba928a38e00" exitCode=0 Apr 22 20:19:40.761615 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:40.761141 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" event={"ID":"41e60254-1707-4dc6-a262-7f65b32e8322","Type":"ContainerDied","Data":"292382cbb237bac3b31e8470a586338094ff222cdc4c3afe7527fba928a38e00"} Apr 22 20:19:44.528222 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:44.528179 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" podUID="23e78406-ea50-4890-ae63-5e1955091bd7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.46:8080: connect: connection refused" Apr 22 20:19:53.815504 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:53.815465 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" 
event={"ID":"41e60254-1707-4dc6-a262-7f65b32e8322","Type":"ContainerStarted","Data":"1d2cc8525a0d0877954ce657383ede01ecefa91628b7a5c0bb5c6465b81b2b0c"} Apr 22 20:19:53.816019 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:53.815828 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" Apr 22 20:19:53.817241 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:53.817216 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" podUID="41e60254-1707-4dc6-a262-7f65b32e8322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 22 20:19:53.834794 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:53.834755 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" podStartSLOduration=6.507420297 podStartE2EDuration="18.834744115s" podCreationTimestamp="2026-04-22 20:19:35 +0000 UTC" firstStartedPulling="2026-04-22 20:19:40.762487153 +0000 UTC m=+1269.182393923" lastFinishedPulling="2026-04-22 20:19:53.089810968 +0000 UTC m=+1281.509717741" observedRunningTime="2026-04-22 20:19:53.833235415 +0000 UTC m=+1282.253142202" watchObservedRunningTime="2026-04-22 20:19:53.834744115 +0000 UTC m=+1282.254650884" Apr 22 20:19:54.528230 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:54.528189 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" podUID="23e78406-ea50-4890-ae63-5e1955091bd7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.46:8080: connect: connection refused" Apr 22 20:19:54.820087 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:19:54.819984 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" podUID="41e60254-1707-4dc6-a262-7f65b32e8322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 22 20:20:04.528264 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:04.528222 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" podUID="23e78406-ea50-4890-ae63-5e1955091bd7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.46:8080: connect: connection refused" Apr 22 20:20:04.528666 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:04.528352 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" Apr 22 20:20:04.820275 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:04.820190 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" podUID="41e60254-1707-4dc6-a262-7f65b32e8322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 22 20:20:05.357210 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.357187 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" Apr 22 20:20:05.402474 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.402441 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23e78406-ea50-4890-ae63-5e1955091bd7-kserve-provision-location\") pod \"23e78406-ea50-4890-ae63-5e1955091bd7\" (UID: \"23e78406-ea50-4890-ae63-5e1955091bd7\") " Apr 22 20:20:05.402824 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.402798 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23e78406-ea50-4890-ae63-5e1955091bd7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "23e78406-ea50-4890-ae63-5e1955091bd7" (UID: "23e78406-ea50-4890-ae63-5e1955091bd7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:20:05.503747 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.503669 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23e78406-ea50-4890-ae63-5e1955091bd7-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:20:05.858592 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.858510 2583 generic.go:358] "Generic (PLEG): container finished" podID="23e78406-ea50-4890-ae63-5e1955091bd7" containerID="91e603df020d3a8c46790dd7f7593e28cb392a54e4e18926ec4da176b737cb42" exitCode=0 Apr 22 20:20:05.858592 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.858568 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" event={"ID":"23e78406-ea50-4890-ae63-5e1955091bd7","Type":"ContainerDied","Data":"91e603df020d3a8c46790dd7f7593e28cb392a54e4e18926ec4da176b737cb42"} Apr 22 20:20:05.858592 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.858582 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" Apr 22 20:20:05.859154 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.858598 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq" event={"ID":"23e78406-ea50-4890-ae63-5e1955091bd7","Type":"ContainerDied","Data":"107c407a436a7bb79ec712ef174998eee78468aafda3cced6fddb3c5abf10519"} Apr 22 20:20:05.859154 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.858614 2583 scope.go:117] "RemoveContainer" containerID="91e603df020d3a8c46790dd7f7593e28cb392a54e4e18926ec4da176b737cb42" Apr 22 20:20:05.867149 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.867131 2583 scope.go:117] "RemoveContainer" containerID="49c0c46ee22bf5e0a75b250e47cd2ac7d8cfc8fe899464b6deae8b672da22a45" Apr 22 20:20:05.874639 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.874623 2583 scope.go:117] "RemoveContainer" containerID="1a9c294b42f679326bb1609872891880ff0bbb0a3a575000affd444a26f1f010" Apr 22 20:20:05.880236 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.880214 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq"] Apr 22 20:20:05.881973 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.881946 2583 scope.go:117] "RemoveContainer" containerID="91e603df020d3a8c46790dd7f7593e28cb392a54e4e18926ec4da176b737cb42" Apr 22 20:20:05.882211 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:20:05.882192 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91e603df020d3a8c46790dd7f7593e28cb392a54e4e18926ec4da176b737cb42\": container with ID starting with 91e603df020d3a8c46790dd7f7593e28cb392a54e4e18926ec4da176b737cb42 not found: ID does not exist" containerID="91e603df020d3a8c46790dd7f7593e28cb392a54e4e18926ec4da176b737cb42" Apr 22 20:20:05.882295 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.882224 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e603df020d3a8c46790dd7f7593e28cb392a54e4e18926ec4da176b737cb42"} err="failed to get container status \"91e603df020d3a8c46790dd7f7593e28cb392a54e4e18926ec4da176b737cb42\": rpc error: code = NotFound desc = could not find container \"91e603df020d3a8c46790dd7f7593e28cb392a54e4e18926ec4da176b737cb42\": container with ID starting with 91e603df020d3a8c46790dd7f7593e28cb392a54e4e18926ec4da176b737cb42 not found: ID does not exist" Apr 22 20:20:05.882295 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.882250 2583 scope.go:117] "RemoveContainer" containerID="49c0c46ee22bf5e0a75b250e47cd2ac7d8cfc8fe899464b6deae8b672da22a45" Apr 22 20:20:05.882513 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:20:05.882496 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49c0c46ee22bf5e0a75b250e47cd2ac7d8cfc8fe899464b6deae8b672da22a45\": container with ID starting with 49c0c46ee22bf5e0a75b250e47cd2ac7d8cfc8fe899464b6deae8b672da22a45 not found: ID does not exist" containerID="49c0c46ee22bf5e0a75b250e47cd2ac7d8cfc8fe899464b6deae8b672da22a45" Apr 22 20:20:05.882565 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.882518 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c0c46ee22bf5e0a75b250e47cd2ac7d8cfc8fe899464b6deae8b672da22a45"} err="failed to get container status 
\"49c0c46ee22bf5e0a75b250e47cd2ac7d8cfc8fe899464b6deae8b672da22a45\": rpc error: code = NotFound desc = could not find container \"49c0c46ee22bf5e0a75b250e47cd2ac7d8cfc8fe899464b6deae8b672da22a45\": container with ID starting with 49c0c46ee22bf5e0a75b250e47cd2ac7d8cfc8fe899464b6deae8b672da22a45 not found: ID does not exist" Apr 22 20:20:05.882565 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.882533 2583 scope.go:117] "RemoveContainer" containerID="1a9c294b42f679326bb1609872891880ff0bbb0a3a575000affd444a26f1f010" Apr 22 20:20:05.882748 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:20:05.882730 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a9c294b42f679326bb1609872891880ff0bbb0a3a575000affd444a26f1f010\": container with ID starting with 1a9c294b42f679326bb1609872891880ff0bbb0a3a575000affd444a26f1f010 not found: ID does not exist" containerID="1a9c294b42f679326bb1609872891880ff0bbb0a3a575000affd444a26f1f010" Apr 22 20:20:05.882809 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.882755 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a9c294b42f679326bb1609872891880ff0bbb0a3a575000affd444a26f1f010"} err="failed to get container status \"1a9c294b42f679326bb1609872891880ff0bbb0a3a575000affd444a26f1f010\": rpc error: code = NotFound desc = could not find container \"1a9c294b42f679326bb1609872891880ff0bbb0a3a575000affd444a26f1f010\": container with ID starting with 1a9c294b42f679326bb1609872891880ff0bbb0a3a575000affd444a26f1f010 not found: ID does not exist" Apr 22 20:20:05.885531 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:05.885513 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-676648fc54-vt8mq"] Apr 22 20:20:06.096761 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:06.096728 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e78406-ea50-4890-ae63-5e1955091bd7" path="/var/lib/kubelet/pods/23e78406-ea50-4890-ae63-5e1955091bd7/volumes" Apr 22 20:20:14.820700 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:14.820647 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" podUID="41e60254-1707-4dc6-a262-7f65b32e8322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 22 20:20:24.820888 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:24.820812 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" podUID="41e60254-1707-4dc6-a262-7f65b32e8322" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.47:8080: connect: connection refused" Apr 22 20:20:34.821755 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:34.821722 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" Apr 22 20:20:36.716181 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:36.716139 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl"] Apr 22 20:20:36.716586 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:36.716422 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" podUID="41e60254-1707-4dc6-a262-7f65b32e8322" containerName="kserve-container" 
containerID="cri-o://1d2cc8525a0d0877954ce657383ede01ecefa91628b7a5c0bb5c6465b81b2b0c" gracePeriod=30 Apr 22 20:20:39.467469 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:39.467445 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" Apr 22 20:20:39.604951 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:39.604827 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41e60254-1707-4dc6-a262-7f65b32e8322-kserve-provision-location\") pod \"41e60254-1707-4dc6-a262-7f65b32e8322\" (UID: \"41e60254-1707-4dc6-a262-7f65b32e8322\") " Apr 22 20:20:39.614803 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:39.614768 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41e60254-1707-4dc6-a262-7f65b32e8322-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "41e60254-1707-4dc6-a262-7f65b32e8322" (UID: "41e60254-1707-4dc6-a262-7f65b32e8322"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:20:39.705920 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:39.705886 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41e60254-1707-4dc6-a262-7f65b32e8322-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:20:39.979353 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:39.979320 2583 generic.go:358] "Generic (PLEG): container finished" podID="41e60254-1707-4dc6-a262-7f65b32e8322" containerID="1d2cc8525a0d0877954ce657383ede01ecefa91628b7a5c0bb5c6465b81b2b0c" exitCode=0 Apr 22 20:20:39.979536 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:39.979374 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" event={"ID":"41e60254-1707-4dc6-a262-7f65b32e8322","Type":"ContainerDied","Data":"1d2cc8525a0d0877954ce657383ede01ecefa91628b7a5c0bb5c6465b81b2b0c"} Apr 22 20:20:39.979536 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:39.979383 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" Apr 22 20:20:39.979536 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:39.979400 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl" event={"ID":"41e60254-1707-4dc6-a262-7f65b32e8322","Type":"ContainerDied","Data":"49eb62691e50a1cdaf2d5d659c6ed41d179c89e63a3b0edfbde27f0a4c4bbbf8"} Apr 22 20:20:39.979536 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:39.979417 2583 scope.go:117] "RemoveContainer" containerID="1d2cc8525a0d0877954ce657383ede01ecefa91628b7a5c0bb5c6465b81b2b0c" Apr 22 20:20:39.988411 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:39.988393 2583 scope.go:117] "RemoveContainer" containerID="292382cbb237bac3b31e8470a586338094ff222cdc4c3afe7527fba928a38e00" Apr 22 20:20:39.995708 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:39.995691 2583 scope.go:117] "RemoveContainer" containerID="1d2cc8525a0d0877954ce657383ede01ecefa91628b7a5c0bb5c6465b81b2b0c" Apr 22 20:20:39.995955 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:20:39.995934 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d2cc8525a0d0877954ce657383ede01ecefa91628b7a5c0bb5c6465b81b2b0c\": container with ID starting with 1d2cc8525a0d0877954ce657383ede01ecefa91628b7a5c0bb5c6465b81b2b0c not found: ID does not exist" containerID="1d2cc8525a0d0877954ce657383ede01ecefa91628b7a5c0bb5c6465b81b2b0c" Apr 22 20:20:39.996035 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:39.995967 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d2cc8525a0d0877954ce657383ede01ecefa91628b7a5c0bb5c6465b81b2b0c"} err="failed to get container status \"1d2cc8525a0d0877954ce657383ede01ecefa91628b7a5c0bb5c6465b81b2b0c\": rpc error: code = NotFound desc = could not find container \"1d2cc8525a0d0877954ce657383ede01ecefa91628b7a5c0bb5c6465b81b2b0c\": container with ID starting with 1d2cc8525a0d0877954ce657383ede01ecefa91628b7a5c0bb5c6465b81b2b0c not found: ID does not exist" Apr 22 20:20:39.996035 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:39.995989 2583 scope.go:117] "RemoveContainer" containerID="292382cbb237bac3b31e8470a586338094ff222cdc4c3afe7527fba928a38e00" Apr 22 20:20:39.996234 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:20:39.996215 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"292382cbb237bac3b31e8470a586338094ff222cdc4c3afe7527fba928a38e00\": container with ID starting with 292382cbb237bac3b31e8470a586338094ff222cdc4c3afe7527fba928a38e00 not found: ID does not exist" containerID="292382cbb237bac3b31e8470a586338094ff222cdc4c3afe7527fba928a38e00" Apr 22 20:20:39.996270 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:39.996241 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"292382cbb237bac3b31e8470a586338094ff222cdc4c3afe7527fba928a38e00"} err="failed to get container status \"292382cbb237bac3b31e8470a586338094ff222cdc4c3afe7527fba928a38e00\": rpc error: code = NotFound desc = could not find container \"292382cbb237bac3b31e8470a586338094ff222cdc4c3afe7527fba928a38e00\": container with ID starting with 292382cbb237bac3b31e8470a586338094ff222cdc4c3afe7527fba928a38e00 not found: ID does not exist" Apr 22 20:20:39.999189 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:39.999169 2583 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl"] Apr 22 20:20:40.003076 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:40.003056 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-ghxdl"] Apr 22 20:20:40.097576 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:20:40.097547 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41e60254-1707-4dc6-a262-7f65b32e8322" path="/var/lib/kubelet/pods/41e60254-1707-4dc6-a262-7f65b32e8322/volumes" Apr 22 20:21:48.349244 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.349210 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79"] Apr 22 20:21:48.349680 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.349568 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41e60254-1707-4dc6-a262-7f65b32e8322" containerName="storage-initializer" Apr 22 20:21:48.349680 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.349579 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e60254-1707-4dc6-a262-7f65b32e8322" containerName="storage-initializer" Apr 22 20:21:48.349680 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.349601 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41e60254-1707-4dc6-a262-7f65b32e8322" containerName="kserve-container" Apr 22 20:21:48.349680 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.349607 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e60254-1707-4dc6-a262-7f65b32e8322" containerName="kserve-container" Apr 22 20:21:48.349680 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.349618 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23e78406-ea50-4890-ae63-5e1955091bd7" containerName="kserve-agent" Apr 22 20:21:48.349680 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.349623 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e78406-ea50-4890-ae63-5e1955091bd7" containerName="kserve-agent" Apr 22 20:21:48.349680 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.349630 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23e78406-ea50-4890-ae63-5e1955091bd7" containerName="storage-initializer" Apr 22 20:21:48.349680 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.349635 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e78406-ea50-4890-ae63-5e1955091bd7" containerName="storage-initializer" Apr 22 20:21:48.349680 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.349645 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23e78406-ea50-4890-ae63-5e1955091bd7" containerName="kserve-container" Apr 22 20:21:48.349680 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.349650 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e78406-ea50-4890-ae63-5e1955091bd7" containerName="kserve-container" Apr 22 20:21:48.350062 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.349701 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="23e78406-ea50-4890-ae63-5e1955091bd7" containerName="kserve-container" Apr 22 20:21:48.350062 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.349710 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="23e78406-ea50-4890-ae63-5e1955091bd7" containerName="kserve-agent" Apr 22 20:21:48.350062 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.349717 2583 
memory_manager.go:356] "RemoveStaleState removing state" podUID="41e60254-1707-4dc6-a262-7f65b32e8322" containerName="kserve-container" Apr 22 20:21:48.352162 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.352145 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" Apr 22 20:21:48.354376 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.354346 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t9c5k\"" Apr 22 20:21:48.360132 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.360110 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79"] Apr 22 20:21:48.386103 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.386066 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/203d516e-2ebb-4285-a126-90872fa7f0be-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79\" (UID: \"203d516e-2ebb-4285-a126-90872fa7f0be\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" Apr 22 20:21:48.487146 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.487107 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/203d516e-2ebb-4285-a126-90872fa7f0be-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79\" (UID: \"203d516e-2ebb-4285-a126-90872fa7f0be\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" Apr 22 20:21:48.487491 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.487472 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/203d516e-2ebb-4285-a126-90872fa7f0be-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79\" (UID: \"203d516e-2ebb-4285-a126-90872fa7f0be\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" Apr 22 20:21:48.663338 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.663236 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" Apr 22 20:21:48.789299 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:48.789275 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79"] Apr 22 20:21:48.791705 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:21:48.791669 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod203d516e_2ebb_4285_a126_90872fa7f0be.slice/crio-1014e218750bc733853f1b98c7df0d5ec78cacd3eebe80e9422f8d68db72ca7d WatchSource:0}: Error finding container 1014e218750bc733853f1b98c7df0d5ec78cacd3eebe80e9422f8d68db72ca7d: Status 404 returned error can't find the container with id 1014e218750bc733853f1b98c7df0d5ec78cacd3eebe80e9422f8d68db72ca7d Apr 22 20:21:49.224023 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:49.223986 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" event={"ID":"203d516e-2ebb-4285-a126-90872fa7f0be","Type":"ContainerStarted","Data":"8b8fc0be9fd0adfc948280de498cba8ccba457a8d6f86178265b4add493880a8"} Apr 22 20:21:49.224023 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:49.224027 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" event={"ID":"203d516e-2ebb-4285-a126-90872fa7f0be","Type":"ContainerStarted","Data":"1014e218750bc733853f1b98c7df0d5ec78cacd3eebe80e9422f8d68db72ca7d"} Apr 22 20:21:54.242324 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:54.242292 2583 generic.go:358] "Generic (PLEG): container finished" podID="203d516e-2ebb-4285-a126-90872fa7f0be" containerID="8b8fc0be9fd0adfc948280de498cba8ccba457a8d6f86178265b4add493880a8" exitCode=0 Apr 22 20:21:54.242708 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:54.242343 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" event={"ID":"203d516e-2ebb-4285-a126-90872fa7f0be","Type":"ContainerDied","Data":"8b8fc0be9fd0adfc948280de498cba8ccba457a8d6f86178265b4add493880a8"} Apr 22 20:21:55.247827 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:55.247793 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" event={"ID":"203d516e-2ebb-4285-a126-90872fa7f0be","Type":"ContainerStarted","Data":"1250d28afabaa8d578e2bb6184ce5c3c04ca6ac19428a94a1874c37238b4990d"} Apr 22 20:21:55.248240 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:55.248136 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" Apr 22 20:21:55.249487 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:55.249464 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" podUID="203d516e-2ebb-4285-a126-90872fa7f0be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 22 20:21:55.264999 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:55.264951 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" podStartSLOduration=7.264937489 podStartE2EDuration="7.264937489s" podCreationTimestamp="2026-04-22 20:21:48 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:21:55.26271753 +0000 UTC m=+1403.682624355" watchObservedRunningTime="2026-04-22 20:21:55.264937489 +0000 UTC m=+1403.684844280" Apr 22 20:21:56.251458 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:21:56.251412 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" podUID="203d516e-2ebb-4285-a126-90872fa7f0be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 22 20:22:06.251882 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:06.251821 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" podUID="203d516e-2ebb-4285-a126-90872fa7f0be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 22 20:22:16.252249 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:16.252200 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" podUID="203d516e-2ebb-4285-a126-90872fa7f0be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 22 20:22:26.252028 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:26.251940 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" podUID="203d516e-2ebb-4285-a126-90872fa7f0be" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.48:8080: connect: connection refused" Apr 22 20:22:36.252845 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:36.252810 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" Apr 22 20:22:40.127511 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:40.127480 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79"] Apr 22 20:22:40.127901 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:40.127768 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" podUID="203d516e-2ebb-4285-a126-90872fa7f0be" containerName="kserve-container" containerID="cri-o://1250d28afabaa8d578e2bb6184ce5c3c04ca6ac19428a94a1874c37238b4990d" gracePeriod=30 Apr 22 20:22:42.872788 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:42.872765 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" Apr 22 20:22:43.036547 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:43.036462 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/203d516e-2ebb-4285-a126-90872fa7f0be-kserve-provision-location\") pod \"203d516e-2ebb-4285-a126-90872fa7f0be\" (UID: \"203d516e-2ebb-4285-a126-90872fa7f0be\") " Apr 22 20:22:43.044711 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:43.044684 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/203d516e-2ebb-4285-a126-90872fa7f0be-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "203d516e-2ebb-4285-a126-90872fa7f0be" (UID: "203d516e-2ebb-4285-a126-90872fa7f0be"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:22:43.137380 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:43.137347 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/203d516e-2ebb-4285-a126-90872fa7f0be-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:22:43.418266 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:43.418170 2583 generic.go:358] "Generic (PLEG): container finished" podID="203d516e-2ebb-4285-a126-90872fa7f0be" containerID="1250d28afabaa8d578e2bb6184ce5c3c04ca6ac19428a94a1874c37238b4990d" exitCode=0 Apr 22 20:22:43.418266 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:43.418255 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" Apr 22 20:22:43.418499 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:43.418254 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" event={"ID":"203d516e-2ebb-4285-a126-90872fa7f0be","Type":"ContainerDied","Data":"1250d28afabaa8d578e2bb6184ce5c3c04ca6ac19428a94a1874c37238b4990d"} Apr 22 20:22:43.418499 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:43.418378 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79" event={"ID":"203d516e-2ebb-4285-a126-90872fa7f0be","Type":"ContainerDied","Data":"1014e218750bc733853f1b98c7df0d5ec78cacd3eebe80e9422f8d68db72ca7d"} Apr 22 20:22:43.418499 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:43.418400 2583 scope.go:117] "RemoveContainer" containerID="1250d28afabaa8d578e2bb6184ce5c3c04ca6ac19428a94a1874c37238b4990d" Apr 22 20:22:43.427383 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:43.427368 2583 scope.go:117] "RemoveContainer" containerID="8b8fc0be9fd0adfc948280de498cba8ccba457a8d6f86178265b4add493880a8" Apr 22 20:22:43.434824 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:43.434805 2583 scope.go:117] "RemoveContainer" containerID="1250d28afabaa8d578e2bb6184ce5c3c04ca6ac19428a94a1874c37238b4990d" Apr 22 20:22:43.435106 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:22:43.435087 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1250d28afabaa8d578e2bb6184ce5c3c04ca6ac19428a94a1874c37238b4990d\": container with ID starting with 1250d28afabaa8d578e2bb6184ce5c3c04ca6ac19428a94a1874c37238b4990d not found: ID does not exist" 
containerID="1250d28afabaa8d578e2bb6184ce5c3c04ca6ac19428a94a1874c37238b4990d" Apr 22 20:22:43.435151 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:43.435116 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1250d28afabaa8d578e2bb6184ce5c3c04ca6ac19428a94a1874c37238b4990d"} err="failed to get container status \"1250d28afabaa8d578e2bb6184ce5c3c04ca6ac19428a94a1874c37238b4990d\": rpc error: code = NotFound desc = could not find container \"1250d28afabaa8d578e2bb6184ce5c3c04ca6ac19428a94a1874c37238b4990d\": container with ID starting with 1250d28afabaa8d578e2bb6184ce5c3c04ca6ac19428a94a1874c37238b4990d not found: ID does not exist" Apr 22 20:22:43.435151 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:43.435133 2583 scope.go:117] "RemoveContainer" containerID="8b8fc0be9fd0adfc948280de498cba8ccba457a8d6f86178265b4add493880a8" Apr 22 20:22:43.435356 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:22:43.435339 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b8fc0be9fd0adfc948280de498cba8ccba457a8d6f86178265b4add493880a8\": container with ID starting with 8b8fc0be9fd0adfc948280de498cba8ccba457a8d6f86178265b4add493880a8 not found: ID does not exist" containerID="8b8fc0be9fd0adfc948280de498cba8ccba457a8d6f86178265b4add493880a8" Apr 22 20:22:43.435395 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:43.435364 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b8fc0be9fd0adfc948280de498cba8ccba457a8d6f86178265b4add493880a8"} err="failed to get container status \"8b8fc0be9fd0adfc948280de498cba8ccba457a8d6f86178265b4add493880a8\": rpc error: code = NotFound desc = could not find container \"8b8fc0be9fd0adfc948280de498cba8ccba457a8d6f86178265b4add493880a8\": container with ID starting with 8b8fc0be9fd0adfc948280de498cba8ccba457a8d6f86178265b4add493880a8 not found: ID does not exist" Apr 22 20:22:43.438821 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:43.438800 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79"] Apr 22 20:22:43.442022 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:43.442002 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-t8c79"] Apr 22 20:22:44.096875 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:22:44.096820 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="203d516e-2ebb-4285-a126-90872fa7f0be" path="/var/lib/kubelet/pods/203d516e-2ebb-4285-a126-90872fa7f0be/volumes" Apr 22 20:27:14.077484 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:14.077446 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47"] Apr 22 20:27:14.078031 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:14.077816 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="203d516e-2ebb-4285-a126-90872fa7f0be" containerName="kserve-container" Apr 22 20:27:14.078031 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:14.077829 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="203d516e-2ebb-4285-a126-90872fa7f0be" containerName="kserve-container" Apr 22 20:27:14.078031 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:14.077856 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="203d516e-2ebb-4285-a126-90872fa7f0be" 
containerName="storage-initializer" Apr 22 20:27:14.078031 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:14.077878 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="203d516e-2ebb-4285-a126-90872fa7f0be" containerName="storage-initializer" Apr 22 20:27:14.078031 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:14.077956 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="203d516e-2ebb-4285-a126-90872fa7f0be" containerName="kserve-container" Apr 22 20:27:14.081249 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:14.081226 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" Apr 22 20:27:14.083559 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:14.083536 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t9c5k\"" Apr 22 20:27:14.089596 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:14.089568 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47"] Apr 22 20:27:14.148636 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:14.148603 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9219a74a-9c58-412a-8243-462e41d77e55-kserve-provision-location\") pod \"isvc-primary-adf655-predictor-6bc84c4dd-c2v47\" (UID: \"9219a74a-9c58-412a-8243-462e41d77e55\") " pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" Apr 22 20:27:14.249472 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:14.249436 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9219a74a-9c58-412a-8243-462e41d77e55-kserve-provision-location\") pod \"isvc-primary-adf655-predictor-6bc84c4dd-c2v47\" (UID: \"9219a74a-9c58-412a-8243-462e41d77e55\") " pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" Apr 22 20:27:14.249821 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:14.249800 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9219a74a-9c58-412a-8243-462e41d77e55-kserve-provision-location\") pod \"isvc-primary-adf655-predictor-6bc84c4dd-c2v47\" (UID: \"9219a74a-9c58-412a-8243-462e41d77e55\") " pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" Apr 22 20:27:14.394683 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:14.394588 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" Apr 22 20:27:14.519326 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:14.519292 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47"] Apr 22 20:27:14.521728 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:27:14.521699 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9219a74a_9c58_412a_8243_462e41d77e55.slice/crio-a917534c3c69e61815f4cb6eb465f3476bca8d71fa22ab83317b6bc10b4ddbbc WatchSource:0}: Error finding container a917534c3c69e61815f4cb6eb465f3476bca8d71fa22ab83317b6bc10b4ddbbc: Status 404 returned error can't find the container with id a917534c3c69e61815f4cb6eb465f3476bca8d71fa22ab83317b6bc10b4ddbbc Apr 22 20:27:14.523979 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:14.523958 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:27:15.362592 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:15.362557 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" event={"ID":"9219a74a-9c58-412a-8243-462e41d77e55","Type":"ContainerStarted","Data":"99f86b637f87d6f9d9408c5f2f7ca12c8e4fc200dcdaf065add35863606760e1"} Apr 22 20:27:15.362592 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:15.362595 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" event={"ID":"9219a74a-9c58-412a-8243-462e41d77e55","Type":"ContainerStarted","Data":"a917534c3c69e61815f4cb6eb465f3476bca8d71fa22ab83317b6bc10b4ddbbc"} Apr 22 20:27:18.374644 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:18.374613 2583 generic.go:358] "Generic (PLEG): container finished" podID="9219a74a-9c58-412a-8243-462e41d77e55" containerID="99f86b637f87d6f9d9408c5f2f7ca12c8e4fc200dcdaf065add35863606760e1" exitCode=0 Apr 22 20:27:18.374993 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:18.374679 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" event={"ID":"9219a74a-9c58-412a-8243-462e41d77e55","Type":"ContainerDied","Data":"99f86b637f87d6f9d9408c5f2f7ca12c8e4fc200dcdaf065add35863606760e1"} Apr 22 20:27:19.381537 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:19.381500 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" event={"ID":"9219a74a-9c58-412a-8243-462e41d77e55","Type":"ContainerStarted","Data":"049ba4cfe0e983e89cbed8aa2f101913a681969bf4c6b450a56a2df200904195"} Apr 22 20:27:19.382070 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:19.381808 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" Apr 22 20:27:19.382830 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:19.382804 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" podUID="9219a74a-9c58-412a-8243-462e41d77e55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 20:27:19.396210 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:19.396162 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" podStartSLOduration=5.396149168 podStartE2EDuration="5.396149168s" podCreationTimestamp="2026-04-22 20:27:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:27:19.394824347 +0000 UTC m=+1727.814731140" watchObservedRunningTime="2026-04-22 20:27:19.396149168 +0000 UTC m=+1727.816055960" Apr 22 20:27:20.388147 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:20.388106 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" podUID="9219a74a-9c58-412a-8243-462e41d77e55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 20:27:30.389137 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:30.389087 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" podUID="9219a74a-9c58-412a-8243-462e41d77e55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 20:27:40.388499 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:40.388452 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" podUID="9219a74a-9c58-412a-8243-462e41d77e55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 20:27:50.389122 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:27:50.389075 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" podUID="9219a74a-9c58-412a-8243-462e41d77e55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 20:28:00.388959 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:00.388910 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" podUID="9219a74a-9c58-412a-8243-462e41d77e55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 20:28:10.388244 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:10.388197 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" podUID="9219a74a-9c58-412a-8243-462e41d77e55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 20:28:20.388795 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:20.388746 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" podUID="9219a74a-9c58-412a-8243-462e41d77e55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 20:28:30.390001 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:30.389962 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" Apr 22 20:28:34.194231 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:34.194193 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f"] Apr 22 20:28:34.198004 
ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:34.197985 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" Apr 22 20:28:34.200110 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:34.200072 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-adf655\"" Apr 22 20:28:34.200110 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:34.200101 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-adf655-dockercfg-f2vlk\"" Apr 22 20:28:34.200274 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:34.200156 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 22 20:28:34.204663 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:34.204638 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f"] Apr 22 20:28:34.254245 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:34.254208 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/186595b5-92f1-42c2-90e6-80eee65e841d-kserve-provision-location\") pod \"isvc-secondary-adf655-predictor-5976877d49-nkt7f\" (UID: \"186595b5-92f1-42c2-90e6-80eee65e841d\") " pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" Apr 22 20:28:34.254408 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:34.254261 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/186595b5-92f1-42c2-90e6-80eee65e841d-cabundle-cert\") pod \"isvc-secondary-adf655-predictor-5976877d49-nkt7f\" (UID: \"186595b5-92f1-42c2-90e6-80eee65e841d\") " pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" Apr 22 20:28:34.355621 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:34.355582 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/186595b5-92f1-42c2-90e6-80eee65e841d-cabundle-cert\") pod \"isvc-secondary-adf655-predictor-5976877d49-nkt7f\" (UID: \"186595b5-92f1-42c2-90e6-80eee65e841d\") " pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" Apr 22 20:28:34.355822 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:34.355703 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/186595b5-92f1-42c2-90e6-80eee65e841d-kserve-provision-location\") pod \"isvc-secondary-adf655-predictor-5976877d49-nkt7f\" (UID: \"186595b5-92f1-42c2-90e6-80eee65e841d\") " pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" Apr 22 20:28:34.356095 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:34.356074 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/186595b5-92f1-42c2-90e6-80eee65e841d-kserve-provision-location\") pod \"isvc-secondary-adf655-predictor-5976877d49-nkt7f\" (UID: \"186595b5-92f1-42c2-90e6-80eee65e841d\") " pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" Apr 22 20:28:34.356277 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:34.356259 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/186595b5-92f1-42c2-90e6-80eee65e841d-cabundle-cert\") pod \"isvc-secondary-adf655-predictor-5976877d49-nkt7f\" (UID: \"186595b5-92f1-42c2-90e6-80eee65e841d\") " pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" Apr 22 20:28:34.509819 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:34.509748 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" Apr 22 20:28:34.632004 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:34.631975 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f"] Apr 22 20:28:34.634638 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:28:34.634605 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod186595b5_92f1_42c2_90e6_80eee65e841d.slice/crio-3ebea8721140f0276cd372cfcbd50a2d88b4f26a417ea0f30b157ed49c62374d WatchSource:0}: Error finding container 3ebea8721140f0276cd372cfcbd50a2d88b4f26a417ea0f30b157ed49c62374d: Status 404 returned error can't find the container with id 3ebea8721140f0276cd372cfcbd50a2d88b4f26a417ea0f30b157ed49c62374d Apr 22 20:28:34.644647 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:34.644622 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" event={"ID":"186595b5-92f1-42c2-90e6-80eee65e841d","Type":"ContainerStarted","Data":"3ebea8721140f0276cd372cfcbd50a2d88b4f26a417ea0f30b157ed49c62374d"} Apr 22 20:28:35.650889 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:35.650829 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" event={"ID":"186595b5-92f1-42c2-90e6-80eee65e841d","Type":"ContainerStarted","Data":"16d6fb16238148fb2893b4c3568234ff5e61f3368bb4b1783db53ae0e427094d"} Apr 22 20:28:37.659467 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:37.659391 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-adf655-predictor-5976877d49-nkt7f_186595b5-92f1-42c2-90e6-80eee65e841d/storage-initializer/0.log" Apr 22 20:28:37.659467 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:37.659429 2583 generic.go:358] "Generic (PLEG): container finished" podID="186595b5-92f1-42c2-90e6-80eee65e841d" containerID="16d6fb16238148fb2893b4c3568234ff5e61f3368bb4b1783db53ae0e427094d" exitCode=1 Apr 22 20:28:37.659909 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:37.659499 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" event={"ID":"186595b5-92f1-42c2-90e6-80eee65e841d","Type":"ContainerDied","Data":"16d6fb16238148fb2893b4c3568234ff5e61f3368bb4b1783db53ae0e427094d"} Apr 22 20:28:38.664013 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:38.663979 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-adf655-predictor-5976877d49-nkt7f_186595b5-92f1-42c2-90e6-80eee65e841d/storage-initializer/0.log" Apr 22 20:28:38.664394 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:38.664108 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" 
event={"ID":"186595b5-92f1-42c2-90e6-80eee65e841d","Type":"ContainerStarted","Data":"af9b74d2eea171998a1357d9c7e385020035717a2c6b66788e9d055324525d37"} Apr 22 20:28:41.674948 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:41.674920 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-adf655-predictor-5976877d49-nkt7f_186595b5-92f1-42c2-90e6-80eee65e841d/storage-initializer/1.log" Apr 22 20:28:41.675332 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:41.675252 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-adf655-predictor-5976877d49-nkt7f_186595b5-92f1-42c2-90e6-80eee65e841d/storage-initializer/0.log" Apr 22 20:28:41.675332 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:41.675286 2583 generic.go:358] "Generic (PLEG): container finished" podID="186595b5-92f1-42c2-90e6-80eee65e841d" containerID="af9b74d2eea171998a1357d9c7e385020035717a2c6b66788e9d055324525d37" exitCode=1 Apr 22 20:28:41.675408 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:41.675362 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" event={"ID":"186595b5-92f1-42c2-90e6-80eee65e841d","Type":"ContainerDied","Data":"af9b74d2eea171998a1357d9c7e385020035717a2c6b66788e9d055324525d37"} Apr 22 20:28:41.675440 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:41.675413 2583 scope.go:117] "RemoveContainer" containerID="16d6fb16238148fb2893b4c3568234ff5e61f3368bb4b1783db53ae0e427094d" Apr 22 20:28:41.675757 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:41.675743 2583 scope.go:117] "RemoveContainer" containerID="16d6fb16238148fb2893b4c3568234ff5e61f3368bb4b1783db53ae0e427094d" Apr 22 20:28:41.695774 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:28:41.695736 2583 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-adf655-predictor-5976877d49-nkt7f_kserve-ci-e2e-test_186595b5-92f1-42c2-90e6-80eee65e841d_0 in pod sandbox 3ebea8721140f0276cd372cfcbd50a2d88b4f26a417ea0f30b157ed49c62374d from index: no such id: '16d6fb16238148fb2893b4c3568234ff5e61f3368bb4b1783db53ae0e427094d'" containerID="16d6fb16238148fb2893b4c3568234ff5e61f3368bb4b1783db53ae0e427094d" Apr 22 20:28:41.695897 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:41.695790 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16d6fb16238148fb2893b4c3568234ff5e61f3368bb4b1783db53ae0e427094d"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-adf655-predictor-5976877d49-nkt7f_kserve-ci-e2e-test_186595b5-92f1-42c2-90e6-80eee65e841d_0 in pod sandbox 3ebea8721140f0276cd372cfcbd50a2d88b4f26a417ea0f30b157ed49c62374d from index: no such id: '16d6fb16238148fb2893b4c3568234ff5e61f3368bb4b1783db53ae0e427094d'" Apr 22 20:28:41.696283 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:28:41.696258 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-adf655-predictor-5976877d49-nkt7f_kserve-ci-e2e-test(186595b5-92f1-42c2-90e6-80eee65e841d)\"" pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" podUID="186595b5-92f1-42c2-90e6-80eee65e841d" Apr 22 20:28:42.680129 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:42.680102 2583 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-adf655-predictor-5976877d49-nkt7f_186595b5-92f1-42c2-90e6-80eee65e841d/storage-initializer/1.log" Apr 22 20:28:48.245369 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.245324 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f"] Apr 22 20:28:48.310702 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.310670 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47"] Apr 22 20:28:48.311043 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.311017 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" podUID="9219a74a-9c58-412a-8243-462e41d77e55" containerName="kserve-container" containerID="cri-o://049ba4cfe0e983e89cbed8aa2f101913a681969bf4c6b450a56a2df200904195" gracePeriod=30 Apr 22 20:28:48.368136 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.368104 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2"] Apr 22 20:28:48.373036 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.373016 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" Apr 22 20:28:48.375272 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.375251 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-882905-dockercfg-8n2kh\"" Apr 22 20:28:48.375389 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.375255 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-882905\"" Apr 22 20:28:48.381714 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.381689 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2"] Apr 22 20:28:48.402614 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.402596 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-adf655-predictor-5976877d49-nkt7f_186595b5-92f1-42c2-90e6-80eee65e841d/storage-initializer/1.log" Apr 22 20:28:48.402720 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.402650 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" Apr 22 20:28:48.486423 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.486397 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/186595b5-92f1-42c2-90e6-80eee65e841d-kserve-provision-location\") pod \"186595b5-92f1-42c2-90e6-80eee65e841d\" (UID: \"186595b5-92f1-42c2-90e6-80eee65e841d\") " Apr 22 20:28:48.486592 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.486447 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/186595b5-92f1-42c2-90e6-80eee65e841d-cabundle-cert\") pod \"186595b5-92f1-42c2-90e6-80eee65e841d\" (UID: \"186595b5-92f1-42c2-90e6-80eee65e841d\") " Apr 22 20:28:48.486652 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.486598 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29ea1af5-5dfa-474b-9605-bcd605d63485-kserve-provision-location\") pod \"isvc-init-fail-882905-predictor-6df744d887-rh4d2\" (UID: \"29ea1af5-5dfa-474b-9605-bcd605d63485\") " pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" Apr 22 20:28:48.486704 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.486664 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/186595b5-92f1-42c2-90e6-80eee65e841d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "186595b5-92f1-42c2-90e6-80eee65e841d" (UID: "186595b5-92f1-42c2-90e6-80eee65e841d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:28:48.486777 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.486755 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/29ea1af5-5dfa-474b-9605-bcd605d63485-cabundle-cert\") pod \"isvc-init-fail-882905-predictor-6df744d887-rh4d2\" (UID: \"29ea1af5-5dfa-474b-9605-bcd605d63485\") " pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" Apr 22 20:28:48.486823 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.486810 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/186595b5-92f1-42c2-90e6-80eee65e841d-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:28:48.486914 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.486853 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/186595b5-92f1-42c2-90e6-80eee65e841d-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "186595b5-92f1-42c2-90e6-80eee65e841d" (UID: "186595b5-92f1-42c2-90e6-80eee65e841d"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:28:48.587993 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.587906 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/29ea1af5-5dfa-474b-9605-bcd605d63485-cabundle-cert\") pod \"isvc-init-fail-882905-predictor-6df744d887-rh4d2\" (UID: \"29ea1af5-5dfa-474b-9605-bcd605d63485\") " pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" Apr 22 20:28:48.587993 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.587964 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29ea1af5-5dfa-474b-9605-bcd605d63485-kserve-provision-location\") pod \"isvc-init-fail-882905-predictor-6df744d887-rh4d2\" (UID: \"29ea1af5-5dfa-474b-9605-bcd605d63485\") " pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" Apr 22 20:28:48.588190 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.588074 2583 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/186595b5-92f1-42c2-90e6-80eee65e841d-cabundle-cert\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:28:48.588294 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.588277 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29ea1af5-5dfa-474b-9605-bcd605d63485-kserve-provision-location\") pod \"isvc-init-fail-882905-predictor-6df744d887-rh4d2\" (UID: \"29ea1af5-5dfa-474b-9605-bcd605d63485\") " pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" Apr 22 20:28:48.588519 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.588500 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/29ea1af5-5dfa-474b-9605-bcd605d63485-cabundle-cert\") pod \"isvc-init-fail-882905-predictor-6df744d887-rh4d2\" (UID: \"29ea1af5-5dfa-474b-9605-bcd605d63485\") " pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" Apr 22 20:28:48.701130 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.701091 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" Apr 22 20:28:48.711321 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.711288 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-adf655-predictor-5976877d49-nkt7f_186595b5-92f1-42c2-90e6-80eee65e841d/storage-initializer/1.log" Apr 22 20:28:48.711448 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.711432 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" Apr 22 20:28:48.711555 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.711426 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f" event={"ID":"186595b5-92f1-42c2-90e6-80eee65e841d","Type":"ContainerDied","Data":"3ebea8721140f0276cd372cfcbd50a2d88b4f26a417ea0f30b157ed49c62374d"} Apr 22 20:28:48.711681 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.711578 2583 scope.go:117] "RemoveContainer" containerID="af9b74d2eea171998a1357d9c7e385020035717a2c6b66788e9d055324525d37" Apr 22 20:28:48.752660 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.752628 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f"] Apr 22 20:28:48.755500 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.755469 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-adf655-predictor-5976877d49-nkt7f"] Apr 22 20:28:48.836323 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:48.836296 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2"] Apr 22 20:28:48.838209 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:28:48.838140 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29ea1af5_5dfa_474b_9605_bcd605d63485.slice/crio-b6ebd48cf65ba0ec9a03c7dbd899466ef42ffd907167e918a250b0fb55dbf6ce WatchSource:0}: Error finding container b6ebd48cf65ba0ec9a03c7dbd899466ef42ffd907167e918a250b0fb55dbf6ce: Status 404 returned error can't find the container with id b6ebd48cf65ba0ec9a03c7dbd899466ef42ffd907167e918a250b0fb55dbf6ce Apr 22 20:28:49.717213 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:49.717173 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" event={"ID":"29ea1af5-5dfa-474b-9605-bcd605d63485","Type":"ContainerStarted","Data":"fe1f291cefbf3a584c4e6b5583202762c5f929849320e2c0a22d84914469869e"} Apr 22 20:28:49.717636 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:49.717215 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" event={"ID":"29ea1af5-5dfa-474b-9605-bcd605d63485","Type":"ContainerStarted","Data":"b6ebd48cf65ba0ec9a03c7dbd899466ef42ffd907167e918a250b0fb55dbf6ce"} Apr 22 20:28:50.097146 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:50.097060 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="186595b5-92f1-42c2-90e6-80eee65e841d" path="/var/lib/kubelet/pods/186595b5-92f1-42c2-90e6-80eee65e841d/volumes" Apr 22 20:28:50.388496 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:50.388398 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" podUID="9219a74a-9c58-412a-8243-462e41d77e55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused" Apr 22 20:28:52.728527 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:52.728494 2583 generic.go:358] "Generic (PLEG): container finished" podID="9219a74a-9c58-412a-8243-462e41d77e55" containerID="049ba4cfe0e983e89cbed8aa2f101913a681969bf4c6b450a56a2df200904195" exitCode=0 Apr 22 20:28:52.728953 ip-10-0-135-221 
kubenswrapper[2583]: I0422 20:28:52.728570 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" event={"ID":"9219a74a-9c58-412a-8243-462e41d77e55","Type":"ContainerDied","Data":"049ba4cfe0e983e89cbed8aa2f101913a681969bf4c6b450a56a2df200904195"} Apr 22 20:28:52.759222 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:52.759200 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" Apr 22 20:28:52.926962 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:52.926852 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9219a74a-9c58-412a-8243-462e41d77e55-kserve-provision-location\") pod \"9219a74a-9c58-412a-8243-462e41d77e55\" (UID: \"9219a74a-9c58-412a-8243-462e41d77e55\") " Apr 22 20:28:52.927183 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:52.927159 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9219a74a-9c58-412a-8243-462e41d77e55-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9219a74a-9c58-412a-8243-462e41d77e55" (UID: "9219a74a-9c58-412a-8243-462e41d77e55"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:28:53.027992 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:53.027942 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9219a74a-9c58-412a-8243-462e41d77e55-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:28:53.733470 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:53.733442 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" Apr 22 20:28:53.733993 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:53.733435 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47" event={"ID":"9219a74a-9c58-412a-8243-462e41d77e55","Type":"ContainerDied","Data":"a917534c3c69e61815f4cb6eb465f3476bca8d71fa22ab83317b6bc10b4ddbbc"} Apr 22 20:28:53.733993 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:53.733613 2583 scope.go:117] "RemoveContainer" containerID="049ba4cfe0e983e89cbed8aa2f101913a681969bf4c6b450a56a2df200904195" Apr 22 20:28:53.735078 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:53.735061 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-882905-predictor-6df744d887-rh4d2_29ea1af5-5dfa-474b-9605-bcd605d63485/storage-initializer/0.log" Apr 22 20:28:53.735207 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:53.735100 2583 generic.go:358] "Generic (PLEG): container finished" podID="29ea1af5-5dfa-474b-9605-bcd605d63485" containerID="fe1f291cefbf3a584c4e6b5583202762c5f929849320e2c0a22d84914469869e" exitCode=1 Apr 22 20:28:53.735207 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:53.735168 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" event={"ID":"29ea1af5-5dfa-474b-9605-bcd605d63485","Type":"ContainerDied","Data":"fe1f291cefbf3a584c4e6b5583202762c5f929849320e2c0a22d84914469869e"} Apr 22 20:28:53.746529 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:53.746494 2583 scope.go:117] "RemoveContainer" containerID="99f86b637f87d6f9d9408c5f2f7ca12c8e4fc200dcdaf065add35863606760e1" Apr 22 20:28:53.762746 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:53.762678 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47"] Apr 22 20:28:53.765180 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:53.765158 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-adf655-predictor-6bc84c4dd-c2v47"] Apr 22 20:28:54.096987 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:54.096917 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9219a74a-9c58-412a-8243-462e41d77e55" path="/var/lib/kubelet/pods/9219a74a-9c58-412a-8243-462e41d77e55/volumes" Apr 22 20:28:54.740576 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:54.740550 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-882905-predictor-6df744d887-rh4d2_29ea1af5-5dfa-474b-9605-bcd605d63485/storage-initializer/0.log" Apr 22 20:28:54.740975 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:54.740623 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" event={"ID":"29ea1af5-5dfa-474b-9605-bcd605d63485","Type":"ContainerStarted","Data":"63cf9d196a9da46f310f427c261d6c277a690f227b907385dab895348c0e540f"} Apr 22 20:28:56.748761 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:56.748730 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-882905-predictor-6df744d887-rh4d2_29ea1af5-5dfa-474b-9605-bcd605d63485/storage-initializer/1.log" Apr 22 20:28:56.749206 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:56.749062 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-882905-predictor-6df744d887-rh4d2_29ea1af5-5dfa-474b-9605-bcd605d63485/storage-initializer/0.log" Apr 22 20:28:56.749206 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:56.749097 2583 generic.go:358] "Generic (PLEG): container finished" podID="29ea1af5-5dfa-474b-9605-bcd605d63485" containerID="63cf9d196a9da46f310f427c261d6c277a690f227b907385dab895348c0e540f" exitCode=1 Apr 22 20:28:56.749206 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:56.749172 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" event={"ID":"29ea1af5-5dfa-474b-9605-bcd605d63485","Type":"ContainerDied","Data":"63cf9d196a9da46f310f427c261d6c277a690f227b907385dab895348c0e540f"} Apr 22 20:28:56.749560 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:56.749215 2583 scope.go:117] "RemoveContainer" containerID="fe1f291cefbf3a584c4e6b5583202762c5f929849320e2c0a22d84914469869e" Apr 22 20:28:56.749616 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:56.749600 2583 scope.go:117] "RemoveContainer" containerID="fe1f291cefbf3a584c4e6b5583202762c5f929849320e2c0a22d84914469869e" Apr 22 20:28:56.760644 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:28:56.760614 2583 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-882905-predictor-6df744d887-rh4d2_kserve-ci-e2e-test_29ea1af5-5dfa-474b-9605-bcd605d63485_0 in pod sandbox b6ebd48cf65ba0ec9a03c7dbd899466ef42ffd907167e918a250b0fb55dbf6ce from index: no such id: 'fe1f291cefbf3a584c4e6b5583202762c5f929849320e2c0a22d84914469869e'" containerID="fe1f291cefbf3a584c4e6b5583202762c5f929849320e2c0a22d84914469869e" Apr 22 20:28:56.760722 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:28:56.760663 2583 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-882905-predictor-6df744d887-rh4d2_kserve-ci-e2e-test_29ea1af5-5dfa-474b-9605-bcd605d63485_0 in pod sandbox b6ebd48cf65ba0ec9a03c7dbd899466ef42ffd907167e918a250b0fb55dbf6ce from index: no such id: 'fe1f291cefbf3a584c4e6b5583202762c5f929849320e2c0a22d84914469869e'; Skipping pod \"isvc-init-fail-882905-predictor-6df744d887-rh4d2_kserve-ci-e2e-test(29ea1af5-5dfa-474b-9605-bcd605d63485)\"" logger="UnhandledError" Apr 22 20:28:56.762045 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:28:56.762023 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-882905-predictor-6df744d887-rh4d2_kserve-ci-e2e-test(29ea1af5-5dfa-474b-9605-bcd605d63485)\"" pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" podUID="29ea1af5-5dfa-474b-9605-bcd605d63485" Apr 22 20:28:57.757472 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:57.757445 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-882905-predictor-6df744d887-rh4d2_29ea1af5-5dfa-474b-9605-bcd605d63485/storage-initializer/1.log" Apr 22 20:28:58.375928 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:58.375899 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2"] Apr 22 20:28:58.524107 ip-10-0-135-221 kubenswrapper[2583]: 
I0422 20:28:58.524083 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-882905-predictor-6df744d887-rh4d2_29ea1af5-5dfa-474b-9605-bcd605d63485/storage-initializer/1.log" Apr 22 20:28:58.524226 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:58.524149 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" Apr 22 20:28:58.676668 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:58.676579 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29ea1af5-5dfa-474b-9605-bcd605d63485-kserve-provision-location\") pod \"29ea1af5-5dfa-474b-9605-bcd605d63485\" (UID: \"29ea1af5-5dfa-474b-9605-bcd605d63485\") " Apr 22 20:28:58.676668 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:58.676657 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/29ea1af5-5dfa-474b-9605-bcd605d63485-cabundle-cert\") pod \"29ea1af5-5dfa-474b-9605-bcd605d63485\" (UID: \"29ea1af5-5dfa-474b-9605-bcd605d63485\") " Apr 22 20:28:58.676893 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:58.676834 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29ea1af5-5dfa-474b-9605-bcd605d63485-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "29ea1af5-5dfa-474b-9605-bcd605d63485" (UID: "29ea1af5-5dfa-474b-9605-bcd605d63485"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:28:58.676989 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:58.676968 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29ea1af5-5dfa-474b-9605-bcd605d63485-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "29ea1af5-5dfa-474b-9605-bcd605d63485" (UID: "29ea1af5-5dfa-474b-9605-bcd605d63485"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:28:58.762542 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:58.762513 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-882905-predictor-6df744d887-rh4d2_29ea1af5-5dfa-474b-9605-bcd605d63485/storage-initializer/1.log" Apr 22 20:28:58.763003 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:58.762579 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" event={"ID":"29ea1af5-5dfa-474b-9605-bcd605d63485","Type":"ContainerDied","Data":"b6ebd48cf65ba0ec9a03c7dbd899466ef42ffd907167e918a250b0fb55dbf6ce"} Apr 22 20:28:58.763003 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:58.762622 2583 scope.go:117] "RemoveContainer" containerID="63cf9d196a9da46f310f427c261d6c277a690f227b907385dab895348c0e540f" Apr 22 20:28:58.763003 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:58.762642 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2" Apr 22 20:28:58.777359 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:58.777339 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29ea1af5-5dfa-474b-9605-bcd605d63485-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:28:58.777359 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:58.777361 2583 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/29ea1af5-5dfa-474b-9605-bcd605d63485-cabundle-cert\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:28:58.796325 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:58.796303 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2"] Apr 22 20:28:58.801885 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:28:58.801831 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-882905-predictor-6df744d887-rh4d2"] Apr 22 20:29:00.096707 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:29:00.096664 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29ea1af5-5dfa-474b-9605-bcd605d63485" path="/var/lib/kubelet/pods/29ea1af5-5dfa-474b-9605-bcd605d63485/volumes" Apr 22 20:37:51.801821 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.801780 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr"] Apr 22 20:37:51.802261 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802187 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="186595b5-92f1-42c2-90e6-80eee65e841d" containerName="storage-initializer" Apr 22 20:37:51.802261 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802199 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="186595b5-92f1-42c2-90e6-80eee65e841d" containerName="storage-initializer" Apr 22 20:37:51.802261 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802217 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9219a74a-9c58-412a-8243-462e41d77e55" containerName="kserve-container" Apr 22 20:37:51.802261 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802226 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="9219a74a-9c58-412a-8243-462e41d77e55" containerName="kserve-container" Apr 22 20:37:51.802261 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802240 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="186595b5-92f1-42c2-90e6-80eee65e841d" containerName="storage-initializer" Apr 22 20:37:51.802261 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802246 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="186595b5-92f1-42c2-90e6-80eee65e841d" containerName="storage-initializer" Apr 22 20:37:51.802261 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802257 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29ea1af5-5dfa-474b-9605-bcd605d63485" containerName="storage-initializer" Apr 22 20:37:51.802261 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802262 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ea1af5-5dfa-474b-9605-bcd605d63485" containerName="storage-initializer" Apr 22 20:37:51.802510 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802270 2583 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9219a74a-9c58-412a-8243-462e41d77e55" containerName="storage-initializer" Apr 22 20:37:51.802510 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802276 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="9219a74a-9c58-412a-8243-462e41d77e55" containerName="storage-initializer" Apr 22 20:37:51.802510 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802339 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="29ea1af5-5dfa-474b-9605-bcd605d63485" containerName="storage-initializer" Apr 22 20:37:51.802510 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802375 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="9219a74a-9c58-412a-8243-462e41d77e55" containerName="kserve-container" Apr 22 20:37:51.802510 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802386 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="29ea1af5-5dfa-474b-9605-bcd605d63485" containerName="storage-initializer" Apr 22 20:37:51.802510 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802397 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="186595b5-92f1-42c2-90e6-80eee65e841d" containerName="storage-initializer" Apr 22 20:37:51.802510 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802404 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="186595b5-92f1-42c2-90e6-80eee65e841d" containerName="storage-initializer" Apr 22 20:37:51.802510 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802482 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29ea1af5-5dfa-474b-9605-bcd605d63485" containerName="storage-initializer" Apr 22 20:37:51.802510 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.802488 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="29ea1af5-5dfa-474b-9605-bcd605d63485" containerName="storage-initializer" Apr 22 20:37:51.805619 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.805602 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" Apr 22 20:37:51.807562 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.807538 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t9c5k\"" Apr 22 20:37:51.813271 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.813225 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr"] Apr 22 20:37:51.901889 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.897787 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e90fc3d-0217-4f5b-99f5-88ba17961877-kserve-provision-location\") pod \"isvc-sklearn-predictor-54bd9d6bf7-djjcr\" (UID: \"4e90fc3d-0217-4f5b-99f5-88ba17961877\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" Apr 22 20:37:51.998852 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.998756 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e90fc3d-0217-4f5b-99f5-88ba17961877-kserve-provision-location\") pod \"isvc-sklearn-predictor-54bd9d6bf7-djjcr\" (UID: \"4e90fc3d-0217-4f5b-99f5-88ba17961877\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" Apr 22 20:37:51.999158 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:51.999137 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e90fc3d-0217-4f5b-99f5-88ba17961877-kserve-provision-location\") pod \"isvc-sklearn-predictor-54bd9d6bf7-djjcr\" (UID: \"4e90fc3d-0217-4f5b-99f5-88ba17961877\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" Apr 22 20:37:52.116385 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:52.116341 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" Apr 22 20:37:52.242009 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:52.241976 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr"] Apr 22 20:37:52.245721 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:37:52.245691 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e90fc3d_0217_4f5b_99f5_88ba17961877.slice/crio-38154dffefd91d9f9a979757bf58c5f7652986fefe670c7c6b6e02d9def337c1 WatchSource:0}: Error finding container 38154dffefd91d9f9a979757bf58c5f7652986fefe670c7c6b6e02d9def337c1: Status 404 returned error can't find the container with id 38154dffefd91d9f9a979757bf58c5f7652986fefe670c7c6b6e02d9def337c1 Apr 22 20:37:52.247879 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:52.247848 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:37:52.557982 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:52.557889 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" event={"ID":"4e90fc3d-0217-4f5b-99f5-88ba17961877","Type":"ContainerStarted","Data":"5b07c63e1bef1195db68b83f1560508b4ffe7bdc8aede62fdaad4c7c48ad284f"} Apr 22 20:37:52.557982 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:52.557931 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" event={"ID":"4e90fc3d-0217-4f5b-99f5-88ba17961877","Type":"ContainerStarted","Data":"38154dffefd91d9f9a979757bf58c5f7652986fefe670c7c6b6e02d9def337c1"} Apr 22 20:37:56.572803 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:56.572713 2583 generic.go:358] "Generic (PLEG): container finished" podID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerID="5b07c63e1bef1195db68b83f1560508b4ffe7bdc8aede62fdaad4c7c48ad284f" exitCode=0 Apr 22 20:37:56.572803 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:56.572788 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" event={"ID":"4e90fc3d-0217-4f5b-99f5-88ba17961877","Type":"ContainerDied","Data":"5b07c63e1bef1195db68b83f1560508b4ffe7bdc8aede62fdaad4c7c48ad284f"} Apr 22 20:37:57.577615 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:57.577577 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" event={"ID":"4e90fc3d-0217-4f5b-99f5-88ba17961877","Type":"ContainerStarted","Data":"1a45e8c91c89bdb69b64837ce41037125504cab328798de5985a39d182a87e9b"} Apr 22 20:37:57.578045 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:57.577902 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" Apr 22 20:37:57.579356 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:57.579325 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" podUID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 22 20:37:57.593713 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:57.593654 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" 
podStartSLOduration=6.59363968 podStartE2EDuration="6.59363968s" podCreationTimestamp="2026-04-22 20:37:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:37:57.591015066 +0000 UTC m=+2366.010921856" watchObservedRunningTime="2026-04-22 20:37:57.59363968 +0000 UTC m=+2366.013546470" Apr 22 20:37:58.587678 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:37:58.587436 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" podUID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 22 20:38:08.582926 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:38:08.582847 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" podUID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 22 20:38:18.583548 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:38:18.583499 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" podUID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 22 20:38:28.582724 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:38:28.582669 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" podUID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 22 20:38:38.582933 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:38:38.582885 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" podUID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 22 20:38:48.582630 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:38:48.582575 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" podUID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 22 20:38:58.582418 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:38:58.582374 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" podUID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 22 20:39:08.096052 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:08.096024 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" Apr 22 20:39:11.915492 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:11.915452 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr"] Apr 22 20:39:11.915954 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:11.915779 2583 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" podUID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerName="kserve-container" containerID="cri-o://1a45e8c91c89bdb69b64837ce41037125504cab328798de5985a39d182a87e9b" gracePeriod=30 Apr 22 20:39:11.966217 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:11.966186 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v"] Apr 22 20:39:11.969886 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:11.969841 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v" Apr 22 20:39:11.976550 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:11.976524 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v"] Apr 22 20:39:12.014380 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:12.014353 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a15399d5-9b43-47dd-a9d9-2a228195b63b-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-jh95v\" (UID: \"a15399d5-9b43-47dd-a9d9-2a228195b63b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v" Apr 22 20:39:12.115358 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:12.115316 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a15399d5-9b43-47dd-a9d9-2a228195b63b-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-jh95v\" (UID: \"a15399d5-9b43-47dd-a9d9-2a228195b63b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v" Apr 22 20:39:12.115712 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:12.115690 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a15399d5-9b43-47dd-a9d9-2a228195b63b-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-jh95v\" (UID: \"a15399d5-9b43-47dd-a9d9-2a228195b63b\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v" Apr 22 20:39:12.281773 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:12.281737 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v" Apr 22 20:39:12.435025 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:12.435004 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v"] Apr 22 20:39:12.437529 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:39:12.437502 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda15399d5_9b43_47dd_a9d9_2a228195b63b.slice/crio-9c1bfff7e7392aabd46bf4b013844764458b17173dec39435d97a07b37ceead1 WatchSource:0}: Error finding container 9c1bfff7e7392aabd46bf4b013844764458b17173dec39435d97a07b37ceead1: Status 404 returned error can't find the container with id 9c1bfff7e7392aabd46bf4b013844764458b17173dec39435d97a07b37ceead1 Apr 22 20:39:12.838300 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:12.838261 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v" event={"ID":"a15399d5-9b43-47dd-a9d9-2a228195b63b","Type":"ContainerStarted","Data":"f26b37ededdfd1a89f705765eb341e723b9f3c710dcec52a69b7e2486e454224"} Apr 22 20:39:12.838300 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:12.838305 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v" event={"ID":"a15399d5-9b43-47dd-a9d9-2a228195b63b","Type":"ContainerStarted","Data":"9c1bfff7e7392aabd46bf4b013844764458b17173dec39435d97a07b37ceead1"} Apr 22 20:39:16.472583 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.472554 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" Apr 22 20:39:16.557314 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.557218 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e90fc3d-0217-4f5b-99f5-88ba17961877-kserve-provision-location\") pod \"4e90fc3d-0217-4f5b-99f5-88ba17961877\" (UID: \"4e90fc3d-0217-4f5b-99f5-88ba17961877\") " Apr 22 20:39:16.557579 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.557555 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e90fc3d-0217-4f5b-99f5-88ba17961877-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4e90fc3d-0217-4f5b-99f5-88ba17961877" (UID: "4e90fc3d-0217-4f5b-99f5-88ba17961877"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:39:16.658542 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.658512 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4e90fc3d-0217-4f5b-99f5-88ba17961877-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:39:16.858949 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.858844 2583 generic.go:358] "Generic (PLEG): container finished" podID="a15399d5-9b43-47dd-a9d9-2a228195b63b" containerID="f26b37ededdfd1a89f705765eb341e723b9f3c710dcec52a69b7e2486e454224" exitCode=0 Apr 22 20:39:16.858949 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.858921 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v" event={"ID":"a15399d5-9b43-47dd-a9d9-2a228195b63b","Type":"ContainerDied","Data":"f26b37ededdfd1a89f705765eb341e723b9f3c710dcec52a69b7e2486e454224"} Apr 22 20:39:16.860496 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.860469 2583 generic.go:358] "Generic (PLEG): container finished" podID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerID="1a45e8c91c89bdb69b64837ce41037125504cab328798de5985a39d182a87e9b" exitCode=0 Apr 22 20:39:16.860633 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.860513 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" event={"ID":"4e90fc3d-0217-4f5b-99f5-88ba17961877","Type":"ContainerDied","Data":"1a45e8c91c89bdb69b64837ce41037125504cab328798de5985a39d182a87e9b"} Apr 22 20:39:16.860633 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.860533 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" event={"ID":"4e90fc3d-0217-4f5b-99f5-88ba17961877","Type":"ContainerDied","Data":"38154dffefd91d9f9a979757bf58c5f7652986fefe670c7c6b6e02d9def337c1"} Apr 22 20:39:16.860633 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.860539 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr" Apr 22 20:39:16.860811 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.860548 2583 scope.go:117] "RemoveContainer" containerID="1a45e8c91c89bdb69b64837ce41037125504cab328798de5985a39d182a87e9b" Apr 22 20:39:16.870364 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.870341 2583 scope.go:117] "RemoveContainer" containerID="5b07c63e1bef1195db68b83f1560508b4ffe7bdc8aede62fdaad4c7c48ad284f" Apr 22 20:39:16.878828 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.878810 2583 scope.go:117] "RemoveContainer" containerID="1a45e8c91c89bdb69b64837ce41037125504cab328798de5985a39d182a87e9b" Apr 22 20:39:16.879109 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:39:16.879090 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a45e8c91c89bdb69b64837ce41037125504cab328798de5985a39d182a87e9b\": container with ID starting with 1a45e8c91c89bdb69b64837ce41037125504cab328798de5985a39d182a87e9b not found: ID does not exist" containerID="1a45e8c91c89bdb69b64837ce41037125504cab328798de5985a39d182a87e9b" Apr 22 20:39:16.879162 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.879118 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a45e8c91c89bdb69b64837ce41037125504cab328798de5985a39d182a87e9b"} err="failed to get container status \"1a45e8c91c89bdb69b64837ce41037125504cab328798de5985a39d182a87e9b\": rpc error: code = NotFound desc = could not find container \"1a45e8c91c89bdb69b64837ce41037125504cab328798de5985a39d182a87e9b\": container with ID starting with 1a45e8c91c89bdb69b64837ce41037125504cab328798de5985a39d182a87e9b not found: ID does not exist" Apr 22 20:39:16.879162 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.879137 2583 scope.go:117] "RemoveContainer" containerID="5b07c63e1bef1195db68b83f1560508b4ffe7bdc8aede62fdaad4c7c48ad284f" Apr 22 20:39:16.879420 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:39:16.879399 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b07c63e1bef1195db68b83f1560508b4ffe7bdc8aede62fdaad4c7c48ad284f\": container with ID starting with 5b07c63e1bef1195db68b83f1560508b4ffe7bdc8aede62fdaad4c7c48ad284f not found: ID does not exist" containerID="5b07c63e1bef1195db68b83f1560508b4ffe7bdc8aede62fdaad4c7c48ad284f" Apr 22 20:39:16.879483 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.879429 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b07c63e1bef1195db68b83f1560508b4ffe7bdc8aede62fdaad4c7c48ad284f"} err="failed to get container status \"5b07c63e1bef1195db68b83f1560508b4ffe7bdc8aede62fdaad4c7c48ad284f\": rpc error: code = NotFound desc = could not find container \"5b07c63e1bef1195db68b83f1560508b4ffe7bdc8aede62fdaad4c7c48ad284f\": container with ID starting with 5b07c63e1bef1195db68b83f1560508b4ffe7bdc8aede62fdaad4c7c48ad284f not found: ID does not exist" Apr 22 20:39:16.886014 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.885990 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr"] Apr 22 20:39:16.888949 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:16.888929 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-54bd9d6bf7-djjcr"] Apr 22 20:39:17.866270 ip-10-0-135-221 kubenswrapper[2583]: I0422 
20:39:17.866235 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v" event={"ID":"a15399d5-9b43-47dd-a9d9-2a228195b63b","Type":"ContainerStarted","Data":"ecf80ab0045d7fe1fdaab8d1a31187f79fe00a6b791aa8989f1e69fbcb9b6209"}
Apr 22 20:39:17.866662 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:17.866459 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v"
Apr 22 20:39:17.882488 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:17.882442 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v" podStartSLOduration=6.882427174 podStartE2EDuration="6.882427174s" podCreationTimestamp="2026-04-22 20:39:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:39:17.880646073 +0000 UTC m=+2446.300552886" watchObservedRunningTime="2026-04-22 20:39:17.882427174 +0000 UTC m=+2446.302333966"
Apr 22 20:39:18.097664 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:18.097628 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e90fc3d-0217-4f5b-99f5-88ba17961877" path="/var/lib/kubelet/pods/4e90fc3d-0217-4f5b-99f5-88ba17961877/volumes"
Apr 22 20:39:48.894022 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:48.893970 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v" podUID="a15399d5-9b43-47dd-a9d9-2a228195b63b" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 22 20:39:58.872871 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:39:58.872825 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v"
Apr 22 20:40:02.066489 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:02.066452 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v"]
Apr 22 20:40:02.066915 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:02.066732 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v" podUID="a15399d5-9b43-47dd-a9d9-2a228195b63b" containerName="kserve-container" containerID="cri-o://ecf80ab0045d7fe1fdaab8d1a31187f79fe00a6b791aa8989f1e69fbcb9b6209" gracePeriod=30
Apr 22 20:40:02.126514 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:02.126476 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx"]
Apr 22 20:40:02.126873 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:02.126846 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerName="kserve-container"
Apr 22 20:40:02.126925 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:02.126875 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerName="kserve-container"
Apr 22 20:40:02.126925 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:02.126894 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerName="storage-initializer"
Apr 22 20:40:02.126925 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:02.126901 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerName="storage-initializer"
Apr 22 20:40:02.127023 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:02.126962 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e90fc3d-0217-4f5b-99f5-88ba17961877" containerName="kserve-container"
Apr 22 20:40:02.130092 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:02.130072 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx"
Apr 22 20:40:02.138783 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:02.138758 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx"]
Apr 22 20:40:02.152682 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:02.152649 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/760a684c-ddb8-4044-bc2a-fe942d456384-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx\" (UID: \"760a684c-ddb8-4044-bc2a-fe942d456384\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx"
Apr 22 20:40:02.254073 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:02.254034 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/760a684c-ddb8-4044-bc2a-fe942d456384-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx\" (UID: \"760a684c-ddb8-4044-bc2a-fe942d456384\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx"
Apr 22 20:40:02.254405 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:02.254386 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/760a684c-ddb8-4044-bc2a-fe942d456384-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx\" (UID: \"760a684c-ddb8-4044-bc2a-fe942d456384\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx"
Apr 22 20:40:02.442056 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:02.442017 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx"
Apr 22 20:40:02.568482 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:02.568454 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx"]
Apr 22 20:40:02.570538 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:40:02.570509 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod760a684c_ddb8_4044_bc2a_fe942d456384.slice/crio-87d4d339bf82c71d1f886486f3eb7914493a8761143032ba3ea32b82f18146a8 WatchSource:0}: Error finding container 87d4d339bf82c71d1f886486f3eb7914493a8761143032ba3ea32b82f18146a8: Status 404 returned error can't find the container with id 87d4d339bf82c71d1f886486f3eb7914493a8761143032ba3ea32b82f18146a8
Apr 22 20:40:03.023734 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:03.023694 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx" event={"ID":"760a684c-ddb8-4044-bc2a-fe942d456384","Type":"ContainerStarted","Data":"dfb846fe676262caa751fb736f94a965710bedb5cb16428aaaa1fee211c744bd"}
Apr 22 20:40:03.023734 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:03.023733 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx" event={"ID":"760a684c-ddb8-4044-bc2a-fe942d456384","Type":"ContainerStarted","Data":"87d4d339bf82c71d1f886486f3eb7914493a8761143032ba3ea32b82f18146a8"}
Apr 22 20:40:08.042731 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:08.042698 2583 generic.go:358] "Generic (PLEG): container finished" podID="760a684c-ddb8-4044-bc2a-fe942d456384" containerID="dfb846fe676262caa751fb736f94a965710bedb5cb16428aaaa1fee211c744bd" exitCode=0
Apr 22 20:40:08.043158 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:08.042778 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx" event={"ID":"760a684c-ddb8-4044-bc2a-fe942d456384","Type":"ContainerDied","Data":"dfb846fe676262caa751fb736f94a965710bedb5cb16428aaaa1fee211c744bd"}
Apr 22 20:40:08.871281 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:08.871228 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v" podUID="a15399d5-9b43-47dd-a9d9-2a228195b63b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.53:8080/v2/models/sklearn-v2-mlserver/ready\": dial tcp 10.134.0.53:8080: connect: connection refused"
Apr 22 20:40:09.048385 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:09.048348 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx" event={"ID":"760a684c-ddb8-4044-bc2a-fe942d456384","Type":"ContainerStarted","Data":"ef5fa9aa2b50411d683c9900d243cdd47932449c1388aef3497d6ef666fe825b"}
Apr 22 20:40:09.048737 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:09.048644 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx"
Apr 22 20:40:09.050033 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:09.050006 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx" podUID="760a684c-ddb8-4044-bc2a-fe942d456384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 22 20:40:09.063102 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:09.063062 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx" podStartSLOduration=7.063049372 podStartE2EDuration="7.063049372s" podCreationTimestamp="2026-04-22 20:40:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:40:09.061808705 +0000 UTC m=+2497.481715496" watchObservedRunningTime="2026-04-22 20:40:09.063049372 +0000 UTC m=+2497.482956163"
Apr 22 20:40:09.929891 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:09.929845 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v"
Apr 22 20:40:10.024697 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.024649 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a15399d5-9b43-47dd-a9d9-2a228195b63b-kserve-provision-location\") pod \"a15399d5-9b43-47dd-a9d9-2a228195b63b\" (UID: \"a15399d5-9b43-47dd-a9d9-2a228195b63b\") "
Apr 22 20:40:10.025061 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.025038 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a15399d5-9b43-47dd-a9d9-2a228195b63b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a15399d5-9b43-47dd-a9d9-2a228195b63b" (UID: "a15399d5-9b43-47dd-a9d9-2a228195b63b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:40:10.053036 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.052944 2583 generic.go:358] "Generic (PLEG): container finished" podID="a15399d5-9b43-47dd-a9d9-2a228195b63b" containerID="ecf80ab0045d7fe1fdaab8d1a31187f79fe00a6b791aa8989f1e69fbcb9b6209" exitCode=0
Apr 22 20:40:10.053036 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.053011 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v"
Apr 22 20:40:10.053547 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.053035 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v" event={"ID":"a15399d5-9b43-47dd-a9d9-2a228195b63b","Type":"ContainerDied","Data":"ecf80ab0045d7fe1fdaab8d1a31187f79fe00a6b791aa8989f1e69fbcb9b6209"}
Apr 22 20:40:10.053547 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.053079 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v" event={"ID":"a15399d5-9b43-47dd-a9d9-2a228195b63b","Type":"ContainerDied","Data":"9c1bfff7e7392aabd46bf4b013844764458b17173dec39435d97a07b37ceead1"}
Apr 22 20:40:10.053547 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.053096 2583 scope.go:117] "RemoveContainer" containerID="ecf80ab0045d7fe1fdaab8d1a31187f79fe00a6b791aa8989f1e69fbcb9b6209"
Apr 22 20:40:10.053728 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.053704 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx" podUID="760a684c-ddb8-4044-bc2a-fe942d456384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 22 20:40:10.062107 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.062090 2583 scope.go:117] "RemoveContainer" containerID="f26b37ededdfd1a89f705765eb341e723b9f3c710dcec52a69b7e2486e454224"
Apr 22 20:40:10.071113 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.071021 2583 scope.go:117] "RemoveContainer" containerID="ecf80ab0045d7fe1fdaab8d1a31187f79fe00a6b791aa8989f1e69fbcb9b6209"
Apr 22 20:40:10.071462 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:40:10.071322 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecf80ab0045d7fe1fdaab8d1a31187f79fe00a6b791aa8989f1e69fbcb9b6209\": container with ID starting with ecf80ab0045d7fe1fdaab8d1a31187f79fe00a6b791aa8989f1e69fbcb9b6209 not found: ID does not exist" containerID="ecf80ab0045d7fe1fdaab8d1a31187f79fe00a6b791aa8989f1e69fbcb9b6209"
Apr 22 20:40:10.071462 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.071360 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecf80ab0045d7fe1fdaab8d1a31187f79fe00a6b791aa8989f1e69fbcb9b6209"} err="failed to get container status \"ecf80ab0045d7fe1fdaab8d1a31187f79fe00a6b791aa8989f1e69fbcb9b6209\": rpc error: code = NotFound desc = could not find container \"ecf80ab0045d7fe1fdaab8d1a31187f79fe00a6b791aa8989f1e69fbcb9b6209\": container with ID starting with ecf80ab0045d7fe1fdaab8d1a31187f79fe00a6b791aa8989f1e69fbcb9b6209 not found: ID does not exist"
Apr 22 20:40:10.071462 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.071389 2583 scope.go:117] "RemoveContainer" containerID="f26b37ededdfd1a89f705765eb341e723b9f3c710dcec52a69b7e2486e454224"
Apr 22 20:40:10.071724 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:40:10.071696 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f26b37ededdfd1a89f705765eb341e723b9f3c710dcec52a69b7e2486e454224\": container with ID starting with f26b37ededdfd1a89f705765eb341e723b9f3c710dcec52a69b7e2486e454224 not found: ID does not exist" containerID="f26b37ededdfd1a89f705765eb341e723b9f3c710dcec52a69b7e2486e454224"
Apr 22 20:40:10.071786 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.071724 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f26b37ededdfd1a89f705765eb341e723b9f3c710dcec52a69b7e2486e454224"} err="failed to get container status \"f26b37ededdfd1a89f705765eb341e723b9f3c710dcec52a69b7e2486e454224\": rpc error: code = NotFound desc = could not find container \"f26b37ededdfd1a89f705765eb341e723b9f3c710dcec52a69b7e2486e454224\": container with ID starting with f26b37ededdfd1a89f705765eb341e723b9f3c710dcec52a69b7e2486e454224 not found: ID does not exist"
Apr 22 20:40:10.073152 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.073129 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v"]
Apr 22 20:40:10.076642 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.076622 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-jh95v"]
Apr 22 20:40:10.098494 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.098460 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a15399d5-9b43-47dd-a9d9-2a228195b63b" path="/var/lib/kubelet/pods/a15399d5-9b43-47dd-a9d9-2a228195b63b/volumes"
Apr 22 20:40:10.126329 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:10.126290 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a15399d5-9b43-47dd-a9d9-2a228195b63b-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\""
Apr 22 20:40:20.054583 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:20.054539 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx" podUID="760a684c-ddb8-4044-bc2a-fe942d456384" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused"
Apr 22 20:40:30.054581 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:30.054497 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx"
Apr 22 20:40:39.063606 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.063567 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx_760a684c-ddb8-4044-bc2a-fe942d456384/kserve-container/0.log"
Apr 22 20:40:39.193921 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.193887 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx"]
Apr 22 20:40:39.194194 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.194158 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx" podUID="760a684c-ddb8-4044-bc2a-fe942d456384" containerName="kserve-container" containerID="cri-o://ef5fa9aa2b50411d683c9900d243cdd47932449c1388aef3497d6ef666fe825b" gracePeriod=30
Apr 22 20:40:39.249037 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.249000 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42"]
Apr 22 20:40:39.249391 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.249379 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a15399d5-9b43-47dd-a9d9-2a228195b63b" containerName="kserve-container"
Apr 22 20:40:39.249443 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.249393 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15399d5-9b43-47dd-a9d9-2a228195b63b" containerName="kserve-container"
Apr 22 20:40:39.249443 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.249407 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a15399d5-9b43-47dd-a9d9-2a228195b63b" containerName="storage-initializer"
Apr 22 20:40:39.249443 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.249412 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15399d5-9b43-47dd-a9d9-2a228195b63b" containerName="storage-initializer"
Apr 22 20:40:39.249545 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.249474 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="a15399d5-9b43-47dd-a9d9-2a228195b63b" containerName="kserve-container"
Apr 22 20:40:39.252689 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.252667 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42"
Apr 22 20:40:39.259731 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.259707 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42"]
Apr 22 20:40:39.390310 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.390212 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09b493c0-3e4f-43a3-847e-2b875c8ebebe-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42\" (UID: \"09b493c0-3e4f-43a3-847e-2b875c8ebebe\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42"
Apr 22 20:40:39.491610 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.491575 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09b493c0-3e4f-43a3-847e-2b875c8ebebe-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42\" (UID: \"09b493c0-3e4f-43a3-847e-2b875c8ebebe\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42"
Apr 22 20:40:39.491966 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.491947 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09b493c0-3e4f-43a3-847e-2b875c8ebebe-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42\" (UID: \"09b493c0-3e4f-43a3-847e-2b875c8ebebe\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42"
Apr 22 20:40:39.564268 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.564231 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42"
Apr 22 20:40:39.700139 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:39.700115 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42"]
Apr 22 20:40:39.702539 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:40:39.702506 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09b493c0_3e4f_43a3_847e_2b875c8ebebe.slice/crio-07f969b6da4ac3c978161dd1c0f7f49506c55a9f64162b7a636c4f5829e9e662 WatchSource:0}: Error finding container 07f969b6da4ac3c978161dd1c0f7f49506c55a9f64162b7a636c4f5829e9e662: Status 404 returned error can't find the container with id 07f969b6da4ac3c978161dd1c0f7f49506c55a9f64162b7a636c4f5829e9e662
Apr 22 20:40:40.024564 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.024541 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx"
Apr 22 20:40:40.167755 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.167663 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42" event={"ID":"09b493c0-3e4f-43a3-847e-2b875c8ebebe","Type":"ContainerStarted","Data":"a3ce6f9f11b8a8daca4399f6fbac0e49fcc0db5e5864183cf3c6f378703e908b"}
Apr 22 20:40:40.167755 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.167698 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42" event={"ID":"09b493c0-3e4f-43a3-847e-2b875c8ebebe","Type":"ContainerStarted","Data":"07f969b6da4ac3c978161dd1c0f7f49506c55a9f64162b7a636c4f5829e9e662"}
Apr 22 20:40:40.169295 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.169263 2583 generic.go:358] "Generic (PLEG): container finished" podID="760a684c-ddb8-4044-bc2a-fe942d456384" containerID="ef5fa9aa2b50411d683c9900d243cdd47932449c1388aef3497d6ef666fe825b" exitCode=0
Apr 22 20:40:40.169438 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.169318 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx"
Apr 22 20:40:40.169438 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.169318 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx" event={"ID":"760a684c-ddb8-4044-bc2a-fe942d456384","Type":"ContainerDied","Data":"ef5fa9aa2b50411d683c9900d243cdd47932449c1388aef3497d6ef666fe825b"}
Apr 22 20:40:40.169569 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.169434 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx" event={"ID":"760a684c-ddb8-4044-bc2a-fe942d456384","Type":"ContainerDied","Data":"87d4d339bf82c71d1f886486f3eb7914493a8761143032ba3ea32b82f18146a8"}
Apr 22 20:40:40.169569 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.169465 2583 scope.go:117] "RemoveContainer" containerID="ef5fa9aa2b50411d683c9900d243cdd47932449c1388aef3497d6ef666fe825b"
Apr 22 20:40:40.177677 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.177659 2583 scope.go:117] "RemoveContainer" containerID="dfb846fe676262caa751fb736f94a965710bedb5cb16428aaaa1fee211c744bd"
Apr 22 20:40:40.185911 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.185891 2583 scope.go:117] "RemoveContainer" containerID="ef5fa9aa2b50411d683c9900d243cdd47932449c1388aef3497d6ef666fe825b"
Apr 22 20:40:40.186184 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:40:40.186166 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef5fa9aa2b50411d683c9900d243cdd47932449c1388aef3497d6ef666fe825b\": container with ID starting with ef5fa9aa2b50411d683c9900d243cdd47932449c1388aef3497d6ef666fe825b not found: ID does not exist" containerID="ef5fa9aa2b50411d683c9900d243cdd47932449c1388aef3497d6ef666fe825b"
Apr 22 20:40:40.186247 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.186192 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef5fa9aa2b50411d683c9900d243cdd47932449c1388aef3497d6ef666fe825b"} err="failed to get container status \"ef5fa9aa2b50411d683c9900d243cdd47932449c1388aef3497d6ef666fe825b\": rpc error: code = NotFound desc = could not find container \"ef5fa9aa2b50411d683c9900d243cdd47932449c1388aef3497d6ef666fe825b\": container with ID starting with ef5fa9aa2b50411d683c9900d243cdd47932449c1388aef3497d6ef666fe825b not found: ID does not exist"
Apr 22 20:40:40.186247 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.186212 2583 scope.go:117] "RemoveContainer" containerID="dfb846fe676262caa751fb736f94a965710bedb5cb16428aaaa1fee211c744bd"
Apr 22 20:40:40.186471 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:40:40.186452 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfb846fe676262caa751fb736f94a965710bedb5cb16428aaaa1fee211c744bd\": container with ID starting with dfb846fe676262caa751fb736f94a965710bedb5cb16428aaaa1fee211c744bd not found: ID does not exist" containerID="dfb846fe676262caa751fb736f94a965710bedb5cb16428aaaa1fee211c744bd"
Apr 22 20:40:40.186518 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.186477 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfb846fe676262caa751fb736f94a965710bedb5cb16428aaaa1fee211c744bd"} err="failed to get container status \"dfb846fe676262caa751fb736f94a965710bedb5cb16428aaaa1fee211c744bd\": rpc error: code = NotFound desc = could not find container \"dfb846fe676262caa751fb736f94a965710bedb5cb16428aaaa1fee211c744bd\": container with ID starting with dfb846fe676262caa751fb736f94a965710bedb5cb16428aaaa1fee211c744bd not found: ID does not exist"
Apr 22 20:40:40.197766 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.197738 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/760a684c-ddb8-4044-bc2a-fe942d456384-kserve-provision-location\") pod \"760a684c-ddb8-4044-bc2a-fe942d456384\" (UID: \"760a684c-ddb8-4044-bc2a-fe942d456384\") "
Apr 22 20:40:40.223626 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.223590 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/760a684c-ddb8-4044-bc2a-fe942d456384-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "760a684c-ddb8-4044-bc2a-fe942d456384" (UID: "760a684c-ddb8-4044-bc2a-fe942d456384"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:40:40.299469 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.299430 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/760a684c-ddb8-4044-bc2a-fe942d456384-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\""
Apr 22 20:40:40.489293 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.489263 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx"]
Apr 22 20:40:40.492702 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:40.492675 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-6d4455cbc5-9jsjx"]
Apr 22 20:40:42.102762 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:42.102724 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="760a684c-ddb8-4044-bc2a-fe942d456384" path="/var/lib/kubelet/pods/760a684c-ddb8-4044-bc2a-fe942d456384/volumes"
Apr 22 20:40:44.185274 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:44.185238 2583 generic.go:358] "Generic (PLEG): container finished" podID="09b493c0-3e4f-43a3-847e-2b875c8ebebe" containerID="a3ce6f9f11b8a8daca4399f6fbac0e49fcc0db5e5864183cf3c6f378703e908b" exitCode=0
Apr 22 20:40:44.185635 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:44.185287 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42" event={"ID":"09b493c0-3e4f-43a3-847e-2b875c8ebebe","Type":"ContainerDied","Data":"a3ce6f9f11b8a8daca4399f6fbac0e49fcc0db5e5864183cf3c6f378703e908b"}
Apr 22 20:40:45.189818 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:45.189784 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42" event={"ID":"09b493c0-3e4f-43a3-847e-2b875c8ebebe","Type":"ContainerStarted","Data":"128d978eeb19de6ea2795c7df6a9cf46e58e80b41296aa3042971f2ca3ba30de"}
Apr 22 20:40:45.190184 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:45.190028 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42"
Apr 22 20:40:45.206290 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:40:45.206240 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42" podStartSLOduration=6.206225512 podStartE2EDuration="6.206225512s" podCreationTimestamp="2026-04-22 20:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:40:45.204515755 +0000 UTC m=+2533.624422558" watchObservedRunningTime="2026-04-22 20:40:45.206225512 +0000 UTC m=+2533.626132303"
Apr 22 20:41:16.197936 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:16.197889 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42" podUID="09b493c0-3e4f-43a3-847e-2b875c8ebebe" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 22 20:41:26.195877 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:26.195833 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42"
Apr 22 20:41:29.385318 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:29.385285 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm"]
Apr 22 20:41:29.385745 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:29.385731 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="760a684c-ddb8-4044-bc2a-fe942d456384" containerName="storage-initializer"
Apr 22 20:41:29.385791 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:29.385747 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="760a684c-ddb8-4044-bc2a-fe942d456384" containerName="storage-initializer"
Apr 22 20:41:29.385791 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:29.385762 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="760a684c-ddb8-4044-bc2a-fe942d456384" containerName="kserve-container"
Apr 22 20:41:29.385791 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:29.385768 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="760a684c-ddb8-4044-bc2a-fe942d456384" containerName="kserve-container"
Apr 22 20:41:29.385921 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:29.385841 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="760a684c-ddb8-4044-bc2a-fe942d456384" containerName="kserve-container"
Apr 22 20:41:29.389659 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:29.389642 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm"
Apr 22 20:41:29.397903 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:29.397853 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm"]
Apr 22 20:41:29.426781 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:29.426752 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42"]
Apr 22 20:41:29.427076 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:29.427038 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42" podUID="09b493c0-3e4f-43a3-847e-2b875c8ebebe" containerName="kserve-container" containerID="cri-o://128d978eeb19de6ea2795c7df6a9cf46e58e80b41296aa3042971f2ca3ba30de" gracePeriod=30
Apr 22 20:41:29.527630 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:29.527574 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0c4b2d2-42a5-4715-9527-717b1b671c3b-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7446f9c785-c9rbm\" (UID: \"a0c4b2d2-42a5-4715-9527-717b1b671c3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm"
Apr 22 20:41:29.629088 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:29.629047 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0c4b2d2-42a5-4715-9527-717b1b671c3b-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7446f9c785-c9rbm\" (UID: \"a0c4b2d2-42a5-4715-9527-717b1b671c3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm"
Apr 22 20:41:29.629429 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:29.629409 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0c4b2d2-42a5-4715-9527-717b1b671c3b-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7446f9c785-c9rbm\" (UID: \"a0c4b2d2-42a5-4715-9527-717b1b671c3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm"
Apr 22 20:41:29.704212 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:29.704180 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm"
Apr 22 20:41:29.832330 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:29.832285 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm"]
Apr 22 20:41:29.834787 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:41:29.834745 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0c4b2d2_42a5_4715_9527_717b1b671c3b.slice/crio-fbdba90e571d597bc285d8da79cca799fa9a9bcb555e65eb0c38f2bc521b0f63 WatchSource:0}: Error finding container fbdba90e571d597bc285d8da79cca799fa9a9bcb555e65eb0c38f2bc521b0f63: Status 404 returned error can't find the container with id fbdba90e571d597bc285d8da79cca799fa9a9bcb555e65eb0c38f2bc521b0f63
Apr 22 20:41:30.348639 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:30.348600 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" event={"ID":"a0c4b2d2-42a5-4715-9527-717b1b671c3b","Type":"ContainerStarted","Data":"290d618ee235c3c5cb692480ce4b34471ea1597465504c2247abe5f08a4996cb"}
Apr 22 20:41:30.348639 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:30.348639 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" event={"ID":"a0c4b2d2-42a5-4715-9527-717b1b671c3b","Type":"ContainerStarted","Data":"fbdba90e571d597bc285d8da79cca799fa9a9bcb555e65eb0c38f2bc521b0f63"}
Apr 22 20:41:34.364770 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:34.364728 2583 generic.go:358] "Generic (PLEG): container finished" podID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerID="290d618ee235c3c5cb692480ce4b34471ea1597465504c2247abe5f08a4996cb" exitCode=0
Apr 22 20:41:34.365184 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:34.364805 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" event={"ID":"a0c4b2d2-42a5-4715-9527-717b1b671c3b","Type":"ContainerDied","Data":"290d618ee235c3c5cb692480ce4b34471ea1597465504c2247abe5f08a4996cb"}
Apr 22 20:41:35.369623 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:35.369588 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" event={"ID":"a0c4b2d2-42a5-4715-9527-717b1b671c3b","Type":"ContainerStarted","Data":"4d0c2ee3711192bd22b5d93aacad7f54123d763b74332bf1c9eb95baae06b7c2"}
Apr 22 20:41:35.370045 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:35.369895 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm"
Apr 22 20:41:35.371157 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:35.371131 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 20:41:35.385156 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:35.385107 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" podStartSLOduration=6.385094638 podStartE2EDuration="6.385094638s" podCreationTimestamp="2026-04-22 20:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:41:35.383357845 +0000 UTC m=+2583.803264636" watchObservedRunningTime="2026-04-22 20:41:35.385094638 +0000 UTC m=+2583.805001489"
Apr 22 20:41:36.193751 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:36.193700 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42" podUID="09b493c0-3e4f-43a3-847e-2b875c8ebebe" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.55:8080/v2/models/isvc-sklearn-v2-runtime/ready\": dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 22 20:41:36.375816 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:36.375777 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 20:41:36.871180 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:36.871158 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42"
Apr 22 20:41:36.892078 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:36.892047 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09b493c0-3e4f-43a3-847e-2b875c8ebebe-kserve-provision-location\") pod \"09b493c0-3e4f-43a3-847e-2b875c8ebebe\" (UID: \"09b493c0-3e4f-43a3-847e-2b875c8ebebe\") "
Apr 22 20:41:36.892341 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:36.892319 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09b493c0-3e4f-43a3-847e-2b875c8ebebe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "09b493c0-3e4f-43a3-847e-2b875c8ebebe" (UID: "09b493c0-3e4f-43a3-847e-2b875c8ebebe"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:41:36.993076 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:36.992998 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/09b493c0-3e4f-43a3-847e-2b875c8ebebe-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\""
Apr 22 20:41:37.379915 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:37.379804 2583 generic.go:358] "Generic (PLEG): container finished" podID="09b493c0-3e4f-43a3-847e-2b875c8ebebe" containerID="128d978eeb19de6ea2795c7df6a9cf46e58e80b41296aa3042971f2ca3ba30de" exitCode=0
Apr 22 20:41:37.379915 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:37.379902 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42"
Apr 22 20:41:37.380343 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:37.379899 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42" event={"ID":"09b493c0-3e4f-43a3-847e-2b875c8ebebe","Type":"ContainerDied","Data":"128d978eeb19de6ea2795c7df6a9cf46e58e80b41296aa3042971f2ca3ba30de"}
Apr 22 20:41:37.380343 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:37.380002 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42" event={"ID":"09b493c0-3e4f-43a3-847e-2b875c8ebebe","Type":"ContainerDied","Data":"07f969b6da4ac3c978161dd1c0f7f49506c55a9f64162b7a636c4f5829e9e662"}
Apr 22 20:41:37.380343 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:37.380018 2583 scope.go:117] "RemoveContainer" containerID="128d978eeb19de6ea2795c7df6a9cf46e58e80b41296aa3042971f2ca3ba30de"
Apr 22 20:41:37.388583 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:37.388562 2583 scope.go:117] "RemoveContainer" containerID="a3ce6f9f11b8a8daca4399f6fbac0e49fcc0db5e5864183cf3c6f378703e908b"
Apr 22 20:41:37.396075 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:37.396058 2583 scope.go:117] "RemoveContainer" containerID="128d978eeb19de6ea2795c7df6a9cf46e58e80b41296aa3042971f2ca3ba30de"
Apr 22 20:41:37.396320 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:41:37.396301 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"128d978eeb19de6ea2795c7df6a9cf46e58e80b41296aa3042971f2ca3ba30de\": container with ID starting with 128d978eeb19de6ea2795c7df6a9cf46e58e80b41296aa3042971f2ca3ba30de not found: ID does not exist" containerID="128d978eeb19de6ea2795c7df6a9cf46e58e80b41296aa3042971f2ca3ba30de"
Apr 22 20:41:37.396387 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:37.396326 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128d978eeb19de6ea2795c7df6a9cf46e58e80b41296aa3042971f2ca3ba30de"} err="failed to get container status \"128d978eeb19de6ea2795c7df6a9cf46e58e80b41296aa3042971f2ca3ba30de\": rpc error: code = NotFound desc = could not find container \"128d978eeb19de6ea2795c7df6a9cf46e58e80b41296aa3042971f2ca3ba30de\": container with ID starting with 128d978eeb19de6ea2795c7df6a9cf46e58e80b41296aa3042971f2ca3ba30de not found: ID does not exist"
Apr 22 20:41:37.396387 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:37.396342 2583 scope.go:117] "RemoveContainer" containerID="a3ce6f9f11b8a8daca4399f6fbac0e49fcc0db5e5864183cf3c6f378703e908b"
Apr 22 20:41:37.396596 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:41:37.396580 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3ce6f9f11b8a8daca4399f6fbac0e49fcc0db5e5864183cf3c6f378703e908b\": container with ID starting with a3ce6f9f11b8a8daca4399f6fbac0e49fcc0db5e5864183cf3c6f378703e908b not found: ID does not exist" containerID="a3ce6f9f11b8a8daca4399f6fbac0e49fcc0db5e5864183cf3c6f378703e908b"
Apr 22 20:41:37.396640 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:37.396599 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3ce6f9f11b8a8daca4399f6fbac0e49fcc0db5e5864183cf3c6f378703e908b"} err="failed to get container status \"a3ce6f9f11b8a8daca4399f6fbac0e49fcc0db5e5864183cf3c6f378703e908b\": rpc error: code = NotFound desc = could not find container \"a3ce6f9f11b8a8daca4399f6fbac0e49fcc0db5e5864183cf3c6f378703e908b\": container with ID starting with a3ce6f9f11b8a8daca4399f6fbac0e49fcc0db5e5864183cf3c6f378703e908b not found: ID does not exist"
Apr 22 20:41:37.400759 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:37.400738 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42"]
Apr 22 20:41:37.405670 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:37.405648 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-rjh42"]
Apr 22 20:41:38.096233 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:38.096200 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09b493c0-3e4f-43a3-847e-2b875c8ebebe" path="/var/lib/kubelet/pods/09b493c0-3e4f-43a3-847e-2b875c8ebebe/volumes"
Apr 22 20:41:46.376429 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:46.376386 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 20:41:56.376340 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:41:56.376300 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 20:42:06.376296 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:06.376252 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 20:42:16.375988 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:16.375938 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 20:42:26.376392 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:26.376346 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 20:42:36.376063 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:36.376019 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 20:42:39.092264 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:39.092223 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 22 20:42:49.093562 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:49.093529 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm"
Apr 22 20:42:49.568715 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:49.568680 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm"]
Apr 22 20:42:49.625072 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:49.625040 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9"]
Apr 22 20:42:49.625413 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:49.625400 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09b493c0-3e4f-43a3-847e-2b875c8ebebe" containerName="storage-initializer"
Apr 22 20:42:49.625458 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:49.625414 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b493c0-3e4f-43a3-847e-2b875c8ebebe" containerName="storage-initializer"
Apr 22 20:42:49.625458 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:49.625437 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09b493c0-3e4f-43a3-847e-2b875c8ebebe" containerName="kserve-container"
Apr 22 20:42:49.625458 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:49.625442 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="09b493c0-3e4f-43a3-847e-2b875c8ebebe" containerName="kserve-container"
Apr 22 20:42:49.625557 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:49.625501 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="09b493c0-3e4f-43a3-847e-2b875c8ebebe" containerName="kserve-container"
Apr 22 20:42:49.628784 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:49.628763 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9"
Apr 22 20:42:49.630018 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:49.629965 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerName="kserve-container" containerID="cri-o://4d0c2ee3711192bd22b5d93aacad7f54123d763b74332bf1c9eb95baae06b7c2" gracePeriod=30
Apr 22 20:42:49.636165 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:49.636142 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9"]
Apr 22 20:42:49.758355 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:49.758318 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3c2c757-08f5-49b0-96b1-4e0d97b3fb46-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9\" (UID: \"e3c2c757-08f5-49b0-96b1-4e0d97b3fb46\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9"
Apr 22 20:42:49.859427 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:49.859338 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3c2c757-08f5-49b0-96b1-4e0d97b3fb46-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9\" (UID: \"e3c2c757-08f5-49b0-96b1-4e0d97b3fb46\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9"
Apr 22 20:42:49.859729 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:49.859707 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3c2c757-08f5-49b0-96b1-4e0d97b3fb46-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9\" (UID: \"e3c2c757-08f5-49b0-96b1-4e0d97b3fb46\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9"
Apr 22 20:42:49.940142 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:49.940105 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9"
Apr 22 20:42:50.066366 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:50.066341 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9"]
Apr 22 20:42:50.068490 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:42:50.068459 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3c2c757_08f5_49b0_96b1_4e0d97b3fb46.slice/crio-819af4fdb3334823efa245a58320dba0203604455806b8e133344efa57ffa729 WatchSource:0}: Error finding container 819af4fdb3334823efa245a58320dba0203604455806b8e133344efa57ffa729: Status 404 returned error can't find the container with id 819af4fdb3334823efa245a58320dba0203604455806b8e133344efa57ffa729
Apr 22 20:42:50.638003 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:50.637959 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" event={"ID":"e3c2c757-08f5-49b0-96b1-4e0d97b3fb46","Type":"ContainerStarted","Data":"27abf120d4ac4fb0a53f87c03f2604eeb47adba9aa070c06b7849d1cdf34749f"}
Apr 22 20:42:50.638003 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:50.638004 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" event={"ID":"e3c2c757-08f5-49b0-96b1-4e0d97b3fb46","Type":"ContainerStarted","Data":"819af4fdb3334823efa245a58320dba0203604455806b8e133344efa57ffa729"}
Apr 22 20:42:53.649147 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:53.649119 2583 generic.go:358] "Generic (PLEG): container finished" podID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerID="27abf120d4ac4fb0a53f87c03f2604eeb47adba9aa070c06b7849d1cdf34749f" exitCode=0
Apr 22 20:42:53.649496 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:53.649164 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" event={"ID":"e3c2c757-08f5-49b0-96b1-4e0d97b3fb46","Type":"ContainerDied","Data":"27abf120d4ac4fb0a53f87c03f2604eeb47adba9aa070c06b7849d1cdf34749f"}
Apr 22 20:42:53.650213 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:53.650190 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:42:54.084471 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.084445 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm"
Apr 22 20:42:54.195694 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.195665 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0c4b2d2-42a5-4715-9527-717b1b671c3b-kserve-provision-location\") pod \"a0c4b2d2-42a5-4715-9527-717b1b671c3b\" (UID: \"a0c4b2d2-42a5-4715-9527-717b1b671c3b\") "
Apr 22 20:42:54.196030 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.196006 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0c4b2d2-42a5-4715-9527-717b1b671c3b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a0c4b2d2-42a5-4715-9527-717b1b671c3b" (UID: "a0c4b2d2-42a5-4715-9527-717b1b671c3b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:42:54.297007 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.296975 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0c4b2d2-42a5-4715-9527-717b1b671c3b-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\""
Apr 22 20:42:54.654085 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.653995 2583 generic.go:358] "Generic (PLEG): container finished" podID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerID="4d0c2ee3711192bd22b5d93aacad7f54123d763b74332bf1c9eb95baae06b7c2" exitCode=0
Apr 22 20:42:54.654085 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.654060 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm"
Apr 22 20:42:54.654085 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.654074 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" event={"ID":"a0c4b2d2-42a5-4715-9527-717b1b671c3b","Type":"ContainerDied","Data":"4d0c2ee3711192bd22b5d93aacad7f54123d763b74332bf1c9eb95baae06b7c2"}
Apr 22 20:42:54.654619 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.654113 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm" event={"ID":"a0c4b2d2-42a5-4715-9527-717b1b671c3b","Type":"ContainerDied","Data":"fbdba90e571d597bc285d8da79cca799fa9a9bcb555e65eb0c38f2bc521b0f63"}
Apr 22 20:42:54.654619 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.654129 2583 scope.go:117] "RemoveContainer" containerID="4d0c2ee3711192bd22b5d93aacad7f54123d763b74332bf1c9eb95baae06b7c2"
Apr 22 20:42:54.655704 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.655645 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" event={"ID":"e3c2c757-08f5-49b0-96b1-4e0d97b3fb46","Type":"ContainerStarted","Data":"c8f63d7fd5b2039570058026afe06f0c094eff0c9a82779410e270c3256422e7"}
Apr 22 20:42:54.655938 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.655917 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9"
Apr 22 20:42:54.657229 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.657206 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused"
Apr 22 20:42:54.663435 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.663419 2583 scope.go:117] "RemoveContainer" containerID="290d618ee235c3c5cb692480ce4b34471ea1597465504c2247abe5f08a4996cb"
Apr 22 20:42:54.670441 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.670424 2583 scope.go:117] "RemoveContainer" containerID="4d0c2ee3711192bd22b5d93aacad7f54123d763b74332bf1c9eb95baae06b7c2"
Apr 22 20:42:54.670672 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:42:54.670654 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d0c2ee3711192bd22b5d93aacad7f54123d763b74332bf1c9eb95baae06b7c2\": container with ID starting with 4d0c2ee3711192bd22b5d93aacad7f54123d763b74332bf1c9eb95baae06b7c2 not found: ID does not exist" containerID="4d0c2ee3711192bd22b5d93aacad7f54123d763b74332bf1c9eb95baae06b7c2"
Apr 22 20:42:54.670719 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.670680 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0c2ee3711192bd22b5d93aacad7f54123d763b74332bf1c9eb95baae06b7c2"} err="failed to get container status \"4d0c2ee3711192bd22b5d93aacad7f54123d763b74332bf1c9eb95baae06b7c2\": rpc error: code = NotFound desc = could not find container \"4d0c2ee3711192bd22b5d93aacad7f54123d763b74332bf1c9eb95baae06b7c2\": container with ID starting with 4d0c2ee3711192bd22b5d93aacad7f54123d763b74332bf1c9eb95baae06b7c2 not found: ID does not exist"
Apr 22 20:42:54.670719 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.670696 2583 scope.go:117] "RemoveContainer" containerID="290d618ee235c3c5cb692480ce4b34471ea1597465504c2247abe5f08a4996cb"
Apr 22 20:42:54.670916 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:42:54.670899 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"290d618ee235c3c5cb692480ce4b34471ea1597465504c2247abe5f08a4996cb\": container with ID starting with 290d618ee235c3c5cb692480ce4b34471ea1597465504c2247abe5f08a4996cb not found: ID does not exist" containerID="290d618ee235c3c5cb692480ce4b34471ea1597465504c2247abe5f08a4996cb"
Apr 22 20:42:54.670968 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.670925 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"290d618ee235c3c5cb692480ce4b34471ea1597465504c2247abe5f08a4996cb"} err="failed to get container status \"290d618ee235c3c5cb692480ce4b34471ea1597465504c2247abe5f08a4996cb\": rpc error: code = NotFound desc = could not find container \"290d618ee235c3c5cb692480ce4b34471ea1597465504c2247abe5f08a4996cb\": container with ID starting with 290d618ee235c3c5cb692480ce4b34471ea1597465504c2247abe5f08a4996cb not found: ID does not exist"
Apr 22 20:42:54.679666 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.679629 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" podStartSLOduration=5.679619171 podStartE2EDuration="5.679619171s" podCreationTimestamp="2026-04-22 20:42:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:42:54.677372091 +0000 UTC m=+2663.097278883" watchObservedRunningTime="2026-04-22 20:42:54.679619171 +0000 UTC m=+2663.099525962"
Apr 22 20:42:54.689805 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.689781 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm"]
Apr 22 20:42:54.695473 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:54.695451 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7446f9c785-c9rbm"]
Apr 22 20:42:55.660046 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:55.660006 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused"
Apr 22 20:42:56.097398 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:42:56.097366 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" path="/var/lib/kubelet/pods/a0c4b2d2-42a5-4715-9527-717b1b671c3b/volumes"
Apr 22 20:43:05.660126 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:43:05.660085 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused"
Apr 22 20:43:15.660927 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:43:15.660882 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused"
Apr 22 20:43:25.660110 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:43:25.660011 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused"
Apr 22 20:43:35.660889 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:43:35.660822 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused"
Apr 22 20:43:45.661028 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:43:45.660976 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused"
Apr 22 20:43:55.660909 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:43:55.660834 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused"
Apr 22 20:43:58.093001 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:43:58.092955 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused"
Apr 22 20:44:08.097125 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:08.097090 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9"
Apr 22 20:44:09.761370 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:09.761329 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9"]
Apr 22 20:44:09.761801 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:09.761704 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerName="kserve-container" containerID="cri-o://c8f63d7fd5b2039570058026afe06f0c094eff0c9a82779410e270c3256422e7" gracePeriod=30
Apr 22 20:44:14.206924 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.206894 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9"
Apr 22 20:44:14.276606 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.276520 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3c2c757-08f5-49b0-96b1-4e0d97b3fb46-kserve-provision-location\") pod \"e3c2c757-08f5-49b0-96b1-4e0d97b3fb46\" (UID: \"e3c2c757-08f5-49b0-96b1-4e0d97b3fb46\") "
Apr 22 20:44:14.276757 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.276707 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c2c757-08f5-49b0-96b1-4e0d97b3fb46-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" (UID: "e3c2c757-08f5-49b0-96b1-4e0d97b3fb46"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:44:14.276911 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.276896 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e3c2c757-08f5-49b0-96b1-4e0d97b3fb46-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\""
Apr 22 20:44:14.944560 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.944525 2583 generic.go:358] "Generic (PLEG): container finished" podID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerID="c8f63d7fd5b2039570058026afe06f0c094eff0c9a82779410e270c3256422e7" exitCode=0
Apr 22 20:44:14.944786 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.944601 2583 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" Apr 22 20:44:14.944786 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.944618 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" event={"ID":"e3c2c757-08f5-49b0-96b1-4e0d97b3fb46","Type":"ContainerDied","Data":"c8f63d7fd5b2039570058026afe06f0c094eff0c9a82779410e270c3256422e7"} Apr 22 20:44:14.944786 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.944669 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9" event={"ID":"e3c2c757-08f5-49b0-96b1-4e0d97b3fb46","Type":"ContainerDied","Data":"819af4fdb3334823efa245a58320dba0203604455806b8e133344efa57ffa729"} Apr 22 20:44:14.944786 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.944693 2583 scope.go:117] "RemoveContainer" containerID="c8f63d7fd5b2039570058026afe06f0c094eff0c9a82779410e270c3256422e7" Apr 22 20:44:14.953778 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.953761 2583 scope.go:117] "RemoveContainer" containerID="27abf120d4ac4fb0a53f87c03f2604eeb47adba9aa070c06b7849d1cdf34749f" Apr 22 20:44:14.961488 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.961468 2583 scope.go:117] "RemoveContainer" containerID="c8f63d7fd5b2039570058026afe06f0c094eff0c9a82779410e270c3256422e7" Apr 22 20:44:14.961757 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:44:14.961737 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8f63d7fd5b2039570058026afe06f0c094eff0c9a82779410e270c3256422e7\": container with ID starting with c8f63d7fd5b2039570058026afe06f0c094eff0c9a82779410e270c3256422e7 not found: ID does not exist" containerID="c8f63d7fd5b2039570058026afe06f0c094eff0c9a82779410e270c3256422e7" Apr 22 20:44:14.961834 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.961766 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8f63d7fd5b2039570058026afe06f0c094eff0c9a82779410e270c3256422e7"} err="failed to get container status \"c8f63d7fd5b2039570058026afe06f0c094eff0c9a82779410e270c3256422e7\": rpc error: code = NotFound desc = could not find container \"c8f63d7fd5b2039570058026afe06f0c094eff0c9a82779410e270c3256422e7\": container with ID starting with c8f63d7fd5b2039570058026afe06f0c094eff0c9a82779410e270c3256422e7 not found: ID does not exist" Apr 22 20:44:14.961967 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.961832 2583 scope.go:117] "RemoveContainer" containerID="27abf120d4ac4fb0a53f87c03f2604eeb47adba9aa070c06b7849d1cdf34749f" Apr 22 20:44:14.962631 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:44:14.962609 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27abf120d4ac4fb0a53f87c03f2604eeb47adba9aa070c06b7849d1cdf34749f\": container with ID starting with 27abf120d4ac4fb0a53f87c03f2604eeb47adba9aa070c06b7849d1cdf34749f not found: ID does not exist" containerID="27abf120d4ac4fb0a53f87c03f2604eeb47adba9aa070c06b7849d1cdf34749f" Apr 22 20:44:14.962733 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.962637 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27abf120d4ac4fb0a53f87c03f2604eeb47adba9aa070c06b7849d1cdf34749f"} err="failed to get container status \"27abf120d4ac4fb0a53f87c03f2604eeb47adba9aa070c06b7849d1cdf34749f\": rpc 
error: code = NotFound desc = could not find container \"27abf120d4ac4fb0a53f87c03f2604eeb47adba9aa070c06b7849d1cdf34749f\": container with ID starting with 27abf120d4ac4fb0a53f87c03f2604eeb47adba9aa070c06b7849d1cdf34749f not found: ID does not exist" Apr 22 20:44:14.964292 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.964268 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9"] Apr 22 20:44:14.967616 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:14.967595 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-5448f48998-wv4s9"] Apr 22 20:44:16.096603 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:44:16.096565 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" path="/var/lib/kubelet/pods/e3c2c757-08f5-49b0-96b1-4e0d97b3fb46/volumes" Apr 22 20:49:12.359883 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.359833 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt"] Apr 22 20:49:12.360518 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.360390 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerName="kserve-container" Apr 22 20:49:12.360518 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.360411 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerName="kserve-container" Apr 22 20:49:12.360518 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.360440 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerName="kserve-container" Apr 22 20:49:12.360518 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.360449 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerName="kserve-container" Apr 22 20:49:12.360518 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.360459 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerName="storage-initializer" Apr 22 20:49:12.360518 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.360468 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerName="storage-initializer" Apr 22 20:49:12.360518 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.360480 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerName="storage-initializer" Apr 22 20:49:12.360518 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.360489 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerName="storage-initializer" Apr 22 20:49:12.360981 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.360580 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0c4b2d2-42a5-4715-9527-717b1b671c3b" containerName="kserve-container" Apr 22 20:49:12.360981 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.360597 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="e3c2c757-08f5-49b0-96b1-4e0d97b3fb46" containerName="kserve-container" Apr 22 20:49:12.363928 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.363905 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" Apr 22 20:49:12.365918 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.365894 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t9c5k\"" Apr 22 20:49:12.369789 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.369765 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt"] Apr 22 20:49:12.463013 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.462980 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9711d422-bc22-452f-9658-d960de23c270-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt\" (UID: \"9711d422-bc22-452f-9658-d960de23c270\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" Apr 22 20:49:12.564530 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.564487 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9711d422-bc22-452f-9658-d960de23c270-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt\" (UID: \"9711d422-bc22-452f-9658-d960de23c270\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" Apr 22 20:49:12.564840 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.564822 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9711d422-bc22-452f-9658-d960de23c270-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt\" (UID: \"9711d422-bc22-452f-9658-d960de23c270\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" Apr 22 20:49:12.675315 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.675241 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" Apr 22 20:49:12.800421 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.800388 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt"] Apr 22 20:49:12.804561 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:49:12.804530 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9711d422_bc22_452f_9658_d960de23c270.slice/crio-a333f86f8e99910a9fbf197ca3828990dfa9ea7926d41ce5f68de2b74d1fc64e WatchSource:0}: Error finding container a333f86f8e99910a9fbf197ca3828990dfa9ea7926d41ce5f68de2b74d1fc64e: Status 404 returned error can't find the container with id a333f86f8e99910a9fbf197ca3828990dfa9ea7926d41ce5f68de2b74d1fc64e Apr 22 20:49:12.806449 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.806434 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:49:12.961855 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.961824 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" event={"ID":"9711d422-bc22-452f-9658-d960de23c270","Type":"ContainerStarted","Data":"e9bfabd88af0ab76b9ae496d285b5e4a51e275f14123db986f4da0077d4ad712"} Apr 22 20:49:12.962073 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:12.961882 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" event={"ID":"9711d422-bc22-452f-9658-d960de23c270","Type":"ContainerStarted","Data":"a333f86f8e99910a9fbf197ca3828990dfa9ea7926d41ce5f68de2b74d1fc64e"} Apr 22 20:49:16.976673 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:16.976636 2583 generic.go:358] "Generic (PLEG): container finished" podID="9711d422-bc22-452f-9658-d960de23c270" containerID="e9bfabd88af0ab76b9ae496d285b5e4a51e275f14123db986f4da0077d4ad712" exitCode=0 Apr 22 20:49:16.977095 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:16.976708 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" event={"ID":"9711d422-bc22-452f-9658-d960de23c270","Type":"ContainerDied","Data":"e9bfabd88af0ab76b9ae496d285b5e4a51e275f14123db986f4da0077d4ad712"} Apr 22 20:49:17.982135 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:17.982097 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" event={"ID":"9711d422-bc22-452f-9658-d960de23c270","Type":"ContainerStarted","Data":"c59e394e87dc4c5549bfdbc887d2d304b8ea117979ef3bc1c7b13f5b72fc9de5"} Apr 22 20:49:17.982634 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:17.982347 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" Apr 22 20:49:17.998275 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:17.998218 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" podStartSLOduration=5.998201481 podStartE2EDuration="5.998201481s" podCreationTimestamp="2026-04-22 20:49:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:49:17.995880433 +0000 UTC 
m=+3046.415787226" watchObservedRunningTime="2026-04-22 20:49:17.998201481 +0000 UTC m=+3046.418108273" Apr 22 20:49:48.993949 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:48.993850 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" Apr 22 20:49:52.421287 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:52.421253 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt"] Apr 22 20:49:52.421663 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:52.421542 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" podUID="9711d422-bc22-452f-9658-d960de23c270" containerName="kserve-container" containerID="cri-o://c59e394e87dc4c5549bfdbc887d2d304b8ea117979ef3bc1c7b13f5b72fc9de5" gracePeriod=30 Apr 22 20:49:52.486246 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:52.486209 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9"] Apr 22 20:49:52.490112 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:52.490091 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" Apr 22 20:49:52.498939 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:52.498911 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9"] Apr 22 20:49:52.608952 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:52.608915 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/356b8cfd-4a3f-4ca4-8a17-0d32c62adce7-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-p49j9\" (UID: \"356b8cfd-4a3f-4ca4-8a17-0d32c62adce7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" Apr 22 20:49:52.709626 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:52.709597 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/356b8cfd-4a3f-4ca4-8a17-0d32c62adce7-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-p49j9\" (UID: \"356b8cfd-4a3f-4ca4-8a17-0d32c62adce7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" Apr 22 20:49:52.709974 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:52.709956 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/356b8cfd-4a3f-4ca4-8a17-0d32c62adce7-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-p49j9\" (UID: \"356b8cfd-4a3f-4ca4-8a17-0d32c62adce7\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" Apr 22 20:49:52.801541 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:52.801511 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" Apr 22 20:49:52.925133 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:52.925103 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9"] Apr 22 20:49:52.926983 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:49:52.926951 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod356b8cfd_4a3f_4ca4_8a17_0d32c62adce7.slice/crio-04952a0d6c0a10b92a216807c75d7cbb99d84a82d8c0ad4da1846eda2978e08d WatchSource:0}: Error finding container 04952a0d6c0a10b92a216807c75d7cbb99d84a82d8c0ad4da1846eda2978e08d: Status 404 returned error can't find the container with id 04952a0d6c0a10b92a216807c75d7cbb99d84a82d8c0ad4da1846eda2978e08d Apr 22 20:49:53.101568 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:53.101527 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" event={"ID":"356b8cfd-4a3f-4ca4-8a17-0d32c62adce7","Type":"ContainerStarted","Data":"e74e9576d38baa65b7605180eaefb725a1e6ea3a5ff6ec19593c4029e49de9c1"} Apr 22 20:49:53.101568 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:53.101572 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" event={"ID":"356b8cfd-4a3f-4ca4-8a17-0d32c62adce7","Type":"ContainerStarted","Data":"04952a0d6c0a10b92a216807c75d7cbb99d84a82d8c0ad4da1846eda2978e08d"} Apr 22 20:49:57.115023 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:57.114989 2583 generic.go:358] "Generic (PLEG): container finished" podID="356b8cfd-4a3f-4ca4-8a17-0d32c62adce7" containerID="e74e9576d38baa65b7605180eaefb725a1e6ea3a5ff6ec19593c4029e49de9c1" exitCode=0 Apr 22 20:49:57.115432 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:57.115062 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" event={"ID":"356b8cfd-4a3f-4ca4-8a17-0d32c62adce7","Type":"ContainerDied","Data":"e74e9576d38baa65b7605180eaefb725a1e6ea3a5ff6ec19593c4029e49de9c1"} Apr 22 20:49:58.119787 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:58.119751 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" event={"ID":"356b8cfd-4a3f-4ca4-8a17-0d32c62adce7","Type":"ContainerStarted","Data":"a34af599b8968e008acc2595f622e9fc5d47784aac08a32696217b457cc87676"} Apr 22 20:49:58.120306 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:58.120019 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" Apr 22 20:49:58.136367 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:58.136324 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" podStartSLOduration=6.136311264 podStartE2EDuration="6.136311264s" podCreationTimestamp="2026-04-22 20:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:49:58.135056754 +0000 UTC m=+3086.554963579" watchObservedRunningTime="2026-04-22 20:49:58.136311264 +0000 UTC m=+3086.556218121" Apr 22 20:49:58.986279 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:58.986239 2583 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" podUID="9711d422-bc22-452f-9658-d960de23c270" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.58:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.134.0.58:8080: connect: connection refused" Apr 22 20:49:59.974629 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:49:59.974602 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" Apr 22 20:50:00.075428 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:00.075339 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9711d422-bc22-452f-9658-d960de23c270-kserve-provision-location\") pod \"9711d422-bc22-452f-9658-d960de23c270\" (UID: \"9711d422-bc22-452f-9658-d960de23c270\") " Apr 22 20:50:00.075677 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:00.075653 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9711d422-bc22-452f-9658-d960de23c270-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9711d422-bc22-452f-9658-d960de23c270" (UID: "9711d422-bc22-452f-9658-d960de23c270"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:50:00.127758 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:00.127724 2583 generic.go:358] "Generic (PLEG): container finished" podID="9711d422-bc22-452f-9658-d960de23c270" containerID="c59e394e87dc4c5549bfdbc887d2d304b8ea117979ef3bc1c7b13f5b72fc9de5" exitCode=0 Apr 22 20:50:00.127940 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:00.127791 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" Apr 22 20:50:00.127940 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:00.127813 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" event={"ID":"9711d422-bc22-452f-9658-d960de23c270","Type":"ContainerDied","Data":"c59e394e87dc4c5549bfdbc887d2d304b8ea117979ef3bc1c7b13f5b72fc9de5"} Apr 22 20:50:00.127940 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:00.127883 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt" event={"ID":"9711d422-bc22-452f-9658-d960de23c270","Type":"ContainerDied","Data":"a333f86f8e99910a9fbf197ca3828990dfa9ea7926d41ce5f68de2b74d1fc64e"} Apr 22 20:50:00.127940 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:00.127902 2583 scope.go:117] "RemoveContainer" containerID="c59e394e87dc4c5549bfdbc887d2d304b8ea117979ef3bc1c7b13f5b72fc9de5" Apr 22 20:50:00.136260 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:00.136238 2583 scope.go:117] "RemoveContainer" containerID="e9bfabd88af0ab76b9ae496d285b5e4a51e275f14123db986f4da0077d4ad712" Apr 22 20:50:00.143696 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:00.143674 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt"] Apr 22 20:50:00.144195 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:00.144181 2583 scope.go:117] "RemoveContainer" containerID="c59e394e87dc4c5549bfdbc887d2d304b8ea117979ef3bc1c7b13f5b72fc9de5" Apr 22 20:50:00.144466 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:50:00.144447 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c59e394e87dc4c5549bfdbc887d2d304b8ea117979ef3bc1c7b13f5b72fc9de5\": container with ID starting with c59e394e87dc4c5549bfdbc887d2d304b8ea117979ef3bc1c7b13f5b72fc9de5 not found: ID does not exist" containerID="c59e394e87dc4c5549bfdbc887d2d304b8ea117979ef3bc1c7b13f5b72fc9de5" Apr 22 20:50:00.144512 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:00.144476 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c59e394e87dc4c5549bfdbc887d2d304b8ea117979ef3bc1c7b13f5b72fc9de5"} err="failed to get container status \"c59e394e87dc4c5549bfdbc887d2d304b8ea117979ef3bc1c7b13f5b72fc9de5\": rpc error: code = NotFound desc = could not find container \"c59e394e87dc4c5549bfdbc887d2d304b8ea117979ef3bc1c7b13f5b72fc9de5\": container with ID starting with c59e394e87dc4c5549bfdbc887d2d304b8ea117979ef3bc1c7b13f5b72fc9de5 not found: ID does not exist" Apr 22 20:50:00.144512 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:00.144496 2583 scope.go:117] "RemoveContainer" containerID="e9bfabd88af0ab76b9ae496d285b5e4a51e275f14123db986f4da0077d4ad712" Apr 22 20:50:00.144766 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:50:00.144744 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9bfabd88af0ab76b9ae496d285b5e4a51e275f14123db986f4da0077d4ad712\": container with ID starting with e9bfabd88af0ab76b9ae496d285b5e4a51e275f14123db986f4da0077d4ad712 not found: ID does not exist" containerID="e9bfabd88af0ab76b9ae496d285b5e4a51e275f14123db986f4da0077d4ad712" Apr 22 20:50:00.144883 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:00.144772 2583 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"e9bfabd88af0ab76b9ae496d285b5e4a51e275f14123db986f4da0077d4ad712"} err="failed to get container status \"e9bfabd88af0ab76b9ae496d285b5e4a51e275f14123db986f4da0077d4ad712\": rpc error: code = NotFound desc = could not find container \"e9bfabd88af0ab76b9ae496d285b5e4a51e275f14123db986f4da0077d4ad712\": container with ID starting with e9bfabd88af0ab76b9ae496d285b5e4a51e275f14123db986f4da0077d4ad712 not found: ID does not exist" Apr 22 20:50:00.147009 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:00.146988 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-jt4tt"] Apr 22 20:50:00.176638 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:00.176613 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9711d422-bc22-452f-9658-d960de23c270-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:50:02.097451 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:02.097420 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9711d422-bc22-452f-9658-d960de23c270" path="/var/lib/kubelet/pods/9711d422-bc22-452f-9658-d960de23c270/volumes" Apr 22 20:50:29.194504 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:29.194472 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" Apr 22 20:50:32.716259 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:32.716217 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9"] Apr 22 20:50:32.716733 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:32.716620 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" podUID="356b8cfd-4a3f-4ca4-8a17-0d32c62adce7" containerName="kserve-container" containerID="cri-o://a34af599b8968e008acc2595f622e9fc5d47784aac08a32696217b457cc87676" gracePeriod=30 Apr 22 20:50:39.123855 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:39.123810 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" podUID="356b8cfd-4a3f-4ca4-8a17-0d32c62adce7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.59:8080/v2/models/xgboost-v2-mlserver/ready\": dial tcp 10.134.0.59:8080: connect: connection refused" Apr 22 20:50:40.567501 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:40.567478 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" Apr 22 20:50:40.630572 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:40.630543 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/356b8cfd-4a3f-4ca4-8a17-0d32c62adce7-kserve-provision-location\") pod \"356b8cfd-4a3f-4ca4-8a17-0d32c62adce7\" (UID: \"356b8cfd-4a3f-4ca4-8a17-0d32c62adce7\") " Apr 22 20:50:40.630913 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:40.630892 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/356b8cfd-4a3f-4ca4-8a17-0d32c62adce7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "356b8cfd-4a3f-4ca4-8a17-0d32c62adce7" (UID: "356b8cfd-4a3f-4ca4-8a17-0d32c62adce7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:50:40.731477 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:40.731441 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/356b8cfd-4a3f-4ca4-8a17-0d32c62adce7-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:50:41.277123 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:41.277085 2583 generic.go:358] "Generic (PLEG): container finished" podID="356b8cfd-4a3f-4ca4-8a17-0d32c62adce7" containerID="a34af599b8968e008acc2595f622e9fc5d47784aac08a32696217b457cc87676" exitCode=0 Apr 22 20:50:41.277301 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:41.277159 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" Apr 22 20:50:41.277301 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:41.277162 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" event={"ID":"356b8cfd-4a3f-4ca4-8a17-0d32c62adce7","Type":"ContainerDied","Data":"a34af599b8968e008acc2595f622e9fc5d47784aac08a32696217b457cc87676"} Apr 22 20:50:41.277301 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:41.277204 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9" event={"ID":"356b8cfd-4a3f-4ca4-8a17-0d32c62adce7","Type":"ContainerDied","Data":"04952a0d6c0a10b92a216807c75d7cbb99d84a82d8c0ad4da1846eda2978e08d"} Apr 22 20:50:41.277301 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:41.277219 2583 scope.go:117] "RemoveContainer" containerID="a34af599b8968e008acc2595f622e9fc5d47784aac08a32696217b457cc87676" Apr 22 20:50:41.286098 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:41.285942 2583 scope.go:117] "RemoveContainer" containerID="e74e9576d38baa65b7605180eaefb725a1e6ea3a5ff6ec19593c4029e49de9c1" Apr 22 20:50:41.293303 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:41.293285 2583 scope.go:117] "RemoveContainer" containerID="a34af599b8968e008acc2595f622e9fc5d47784aac08a32696217b457cc87676" Apr 22 20:50:41.293559 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:50:41.293538 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a34af599b8968e008acc2595f622e9fc5d47784aac08a32696217b457cc87676\": container with ID starting with a34af599b8968e008acc2595f622e9fc5d47784aac08a32696217b457cc87676 not found: ID does not exist" 
containerID="a34af599b8968e008acc2595f622e9fc5d47784aac08a32696217b457cc87676" Apr 22 20:50:41.293599 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:41.293568 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a34af599b8968e008acc2595f622e9fc5d47784aac08a32696217b457cc87676"} err="failed to get container status \"a34af599b8968e008acc2595f622e9fc5d47784aac08a32696217b457cc87676\": rpc error: code = NotFound desc = could not find container \"a34af599b8968e008acc2595f622e9fc5d47784aac08a32696217b457cc87676\": container with ID starting with a34af599b8968e008acc2595f622e9fc5d47784aac08a32696217b457cc87676 not found: ID does not exist" Apr 22 20:50:41.293599 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:41.293585 2583 scope.go:117] "RemoveContainer" containerID="e74e9576d38baa65b7605180eaefb725a1e6ea3a5ff6ec19593c4029e49de9c1" Apr 22 20:50:41.293808 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:50:41.293792 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e74e9576d38baa65b7605180eaefb725a1e6ea3a5ff6ec19593c4029e49de9c1\": container with ID starting with e74e9576d38baa65b7605180eaefb725a1e6ea3a5ff6ec19593c4029e49de9c1 not found: ID does not exist" containerID="e74e9576d38baa65b7605180eaefb725a1e6ea3a5ff6ec19593c4029e49de9c1" Apr 22 20:50:41.293851 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:41.293816 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e74e9576d38baa65b7605180eaefb725a1e6ea3a5ff6ec19593c4029e49de9c1"} err="failed to get container status \"e74e9576d38baa65b7605180eaefb725a1e6ea3a5ff6ec19593c4029e49de9c1\": rpc error: code = NotFound desc = could not find container \"e74e9576d38baa65b7605180eaefb725a1e6ea3a5ff6ec19593c4029e49de9c1\": container with ID starting with e74e9576d38baa65b7605180eaefb725a1e6ea3a5ff6ec19593c4029e49de9c1 not found: ID does not exist" Apr 22 20:50:41.297363 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:41.297342 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9"] Apr 22 20:50:41.300646 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:41.300622 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-p49j9"] Apr 22 20:50:42.096307 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:50:42.096274 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="356b8cfd-4a3f-4ca4-8a17-0d32c62adce7" path="/var/lib/kubelet/pods/356b8cfd-4a3f-4ca4-8a17-0d32c62adce7/volumes" Apr 22 20:51:42.967454 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:42.967415 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq"] Apr 22 20:51:42.967879 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:42.967775 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9711d422-bc22-452f-9658-d960de23c270" containerName="kserve-container" Apr 22 20:51:42.967879 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:42.967787 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="9711d422-bc22-452f-9658-d960de23c270" containerName="kserve-container" Apr 22 20:51:42.967879 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:42.967802 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9711d422-bc22-452f-9658-d960de23c270" 
containerName="storage-initializer" Apr 22 20:51:42.967879 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:42.967808 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="9711d422-bc22-452f-9658-d960de23c270" containerName="storage-initializer" Apr 22 20:51:42.967879 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:42.967820 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="356b8cfd-4a3f-4ca4-8a17-0d32c62adce7" containerName="storage-initializer" Apr 22 20:51:42.967879 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:42.967826 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="356b8cfd-4a3f-4ca4-8a17-0d32c62adce7" containerName="storage-initializer" Apr 22 20:51:42.967879 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:42.967834 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="356b8cfd-4a3f-4ca4-8a17-0d32c62adce7" containerName="kserve-container" Apr 22 20:51:42.967879 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:42.967839 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="356b8cfd-4a3f-4ca4-8a17-0d32c62adce7" containerName="kserve-container" Apr 22 20:51:42.968140 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:42.967907 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="356b8cfd-4a3f-4ca4-8a17-0d32c62adce7" containerName="kserve-container" Apr 22 20:51:42.968140 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:42.967919 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="9711d422-bc22-452f-9658-d960de23c270" containerName="kserve-container" Apr 22 20:51:42.971134 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:42.971112 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" Apr 22 20:51:42.973187 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:42.973162 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t9c5k\"" Apr 22 20:51:42.979625 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:42.979603 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq"] Apr 22 20:51:43.073790 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:43.073748 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83a9fe91-5f06-471b-adf7-1165d66c9014-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq\" (UID: \"83a9fe91-5f06-471b-adf7-1165d66c9014\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" Apr 22 20:51:43.175085 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:43.175048 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83a9fe91-5f06-471b-adf7-1165d66c9014-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq\" (UID: \"83a9fe91-5f06-471b-adf7-1165d66c9014\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" Apr 22 20:51:43.175496 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:43.175473 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83a9fe91-5f06-471b-adf7-1165d66c9014-kserve-provision-location\") pod 
\"isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq\" (UID: \"83a9fe91-5f06-471b-adf7-1165d66c9014\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" Apr 22 20:51:43.282399 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:43.282305 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" Apr 22 20:51:43.403452 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:43.403427 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq"] Apr 22 20:51:43.406007 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:51:43.405975 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83a9fe91_5f06_471b_adf7_1165d66c9014.slice/crio-68fd7f6eacfd3ad0438397baa3ee4abeb4666720dbe0ca26f078e33a2bf3db72 WatchSource:0}: Error finding container 68fd7f6eacfd3ad0438397baa3ee4abeb4666720dbe0ca26f078e33a2bf3db72: Status 404 returned error can't find the container with id 68fd7f6eacfd3ad0438397baa3ee4abeb4666720dbe0ca26f078e33a2bf3db72 Apr 22 20:51:43.491146 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:43.491113 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" event={"ID":"83a9fe91-5f06-471b-adf7-1165d66c9014","Type":"ContainerStarted","Data":"7308d24aee5e3d56535aac1460224048bfe0807860c29f5416371f2f69caaa1f"} Apr 22 20:51:43.491146 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:43.491153 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" event={"ID":"83a9fe91-5f06-471b-adf7-1165d66c9014","Type":"ContainerStarted","Data":"68fd7f6eacfd3ad0438397baa3ee4abeb4666720dbe0ca26f078e33a2bf3db72"} Apr 22 20:51:47.506878 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:47.506817 2583 generic.go:358] "Generic (PLEG): container finished" podID="83a9fe91-5f06-471b-adf7-1165d66c9014" containerID="7308d24aee5e3d56535aac1460224048bfe0807860c29f5416371f2f69caaa1f" exitCode=0 Apr 22 20:51:47.507288 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:47.506890 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" event={"ID":"83a9fe91-5f06-471b-adf7-1165d66c9014","Type":"ContainerDied","Data":"7308d24aee5e3d56535aac1460224048bfe0807860c29f5416371f2f69caaa1f"} Apr 22 20:51:48.512699 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:48.512656 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" event={"ID":"83a9fe91-5f06-471b-adf7-1165d66c9014","Type":"ContainerStarted","Data":"ce550ae5ca2080f638c4cbd7c9c84d67d322c66182a7ebafbb48aeeaa69709e3"} Apr 22 20:51:48.513335 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:48.512895 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" Apr 22 20:51:48.527586 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:51:48.527522 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" podStartSLOduration=6.527502919 podStartE2EDuration="6.527502919s" podCreationTimestamp="2026-04-22 20:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:51:48.526398993 +0000 UTC m=+3196.946305799" watchObservedRunningTime="2026-04-22 20:51:48.527502919 +0000 UTC m=+3196.947409711" Apr 22 20:52:19.593929 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:19.593876 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" podUID="83a9fe91-5f06-471b-adf7-1165d66c9014" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 22 20:52:29.518462 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:29.518376 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" Apr 22 20:52:33.038465 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:33.038422 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq"] Apr 22 20:52:33.038947 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:33.038805 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" podUID="83a9fe91-5f06-471b-adf7-1165d66c9014" containerName="kserve-container" containerID="cri-o://ce550ae5ca2080f638c4cbd7c9c84d67d322c66182a7ebafbb48aeeaa69709e3" gracePeriod=30 Apr 22 20:52:39.516610 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:39.516562 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" podUID="83a9fe91-5f06-471b-adf7-1165d66c9014" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.60:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.134.0.60:8080: connect: connection refused" Apr 22 20:52:40.584728 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.584705 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" Apr 22 20:52:40.672766 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.672675 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83a9fe91-5f06-471b-adf7-1165d66c9014-kserve-provision-location\") pod \"83a9fe91-5f06-471b-adf7-1165d66c9014\" (UID: \"83a9fe91-5f06-471b-adf7-1165d66c9014\") " Apr 22 20:52:40.673010 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.672987 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83a9fe91-5f06-471b-adf7-1165d66c9014-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "83a9fe91-5f06-471b-adf7-1165d66c9014" (UID: "83a9fe91-5f06-471b-adf7-1165d66c9014"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:52:40.697540 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.697507 2583 generic.go:358] "Generic (PLEG): container finished" podID="83a9fe91-5f06-471b-adf7-1165d66c9014" containerID="ce550ae5ca2080f638c4cbd7c9c84d67d322c66182a7ebafbb48aeeaa69709e3" exitCode=0 Apr 22 20:52:40.697680 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.697577 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" Apr 22 20:52:40.697680 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.697589 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" event={"ID":"83a9fe91-5f06-471b-adf7-1165d66c9014","Type":"ContainerDied","Data":"ce550ae5ca2080f638c4cbd7c9c84d67d322c66182a7ebafbb48aeeaa69709e3"} Apr 22 20:52:40.697680 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.697630 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq" event={"ID":"83a9fe91-5f06-471b-adf7-1165d66c9014","Type":"ContainerDied","Data":"68fd7f6eacfd3ad0438397baa3ee4abeb4666720dbe0ca26f078e33a2bf3db72"} Apr 22 20:52:40.697680 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.697651 2583 scope.go:117] "RemoveContainer" containerID="ce550ae5ca2080f638c4cbd7c9c84d67d322c66182a7ebafbb48aeeaa69709e3" Apr 22 20:52:40.709880 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.709841 2583 scope.go:117] "RemoveContainer" containerID="7308d24aee5e3d56535aac1460224048bfe0807860c29f5416371f2f69caaa1f" Apr 22 20:52:40.718069 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.718051 2583 scope.go:117] "RemoveContainer" containerID="ce550ae5ca2080f638c4cbd7c9c84d67d322c66182a7ebafbb48aeeaa69709e3" Apr 22 20:52:40.718310 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:52:40.718293 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce550ae5ca2080f638c4cbd7c9c84d67d322c66182a7ebafbb48aeeaa69709e3\": container with ID starting with ce550ae5ca2080f638c4cbd7c9c84d67d322c66182a7ebafbb48aeeaa69709e3 not found: ID does not exist" containerID="ce550ae5ca2080f638c4cbd7c9c84d67d322c66182a7ebafbb48aeeaa69709e3" Apr 22 20:52:40.718372 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.718317 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce550ae5ca2080f638c4cbd7c9c84d67d322c66182a7ebafbb48aeeaa69709e3"} err="failed to get container status \"ce550ae5ca2080f638c4cbd7c9c84d67d322c66182a7ebafbb48aeeaa69709e3\": rpc error: code = NotFound desc = could not find container \"ce550ae5ca2080f638c4cbd7c9c84d67d322c66182a7ebafbb48aeeaa69709e3\": container with ID starting with ce550ae5ca2080f638c4cbd7c9c84d67d322c66182a7ebafbb48aeeaa69709e3 not found: ID does not exist" Apr 22 20:52:40.718372 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.718346 2583 scope.go:117] "RemoveContainer" containerID="7308d24aee5e3d56535aac1460224048bfe0807860c29f5416371f2f69caaa1f" Apr 22 20:52:40.718581 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:52:40.718567 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7308d24aee5e3d56535aac1460224048bfe0807860c29f5416371f2f69caaa1f\": container with ID starting with 7308d24aee5e3d56535aac1460224048bfe0807860c29f5416371f2f69caaa1f not found: ID does not exist" containerID="7308d24aee5e3d56535aac1460224048bfe0807860c29f5416371f2f69caaa1f" Apr 22 20:52:40.718630 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.718585 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7308d24aee5e3d56535aac1460224048bfe0807860c29f5416371f2f69caaa1f"} err="failed to get container status \"7308d24aee5e3d56535aac1460224048bfe0807860c29f5416371f2f69caaa1f\": rpc 
error: code = NotFound desc = could not find container \"7308d24aee5e3d56535aac1460224048bfe0807860c29f5416371f2f69caaa1f\": container with ID starting with 7308d24aee5e3d56535aac1460224048bfe0807860c29f5416371f2f69caaa1f not found: ID does not exist" Apr 22 20:52:40.724629 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.724608 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq"] Apr 22 20:52:40.727955 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.727929 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-5mskq"] Apr 22 20:52:40.774228 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:40.774204 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83a9fe91-5f06-471b-adf7-1165d66c9014-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:52:42.096755 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:52:42.096720 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83a9fe91-5f06-471b-adf7-1165d66c9014" path="/var/lib/kubelet/pods/83a9fe91-5f06-471b-adf7-1165d66c9014/volumes" Apr 22 20:53:43.253738 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.253703 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5"] Apr 22 20:53:43.254278 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.254259 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83a9fe91-5f06-471b-adf7-1165d66c9014" containerName="storage-initializer" Apr 22 20:53:43.254357 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.254283 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="83a9fe91-5f06-471b-adf7-1165d66c9014" containerName="storage-initializer" Apr 22 20:53:43.254357 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.254303 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83a9fe91-5f06-471b-adf7-1165d66c9014" containerName="kserve-container" Apr 22 20:53:43.254357 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.254311 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="83a9fe91-5f06-471b-adf7-1165d66c9014" containerName="kserve-container" Apr 22 20:53:43.254602 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.254581 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="83a9fe91-5f06-471b-adf7-1165d66c9014" containerName="kserve-container" Apr 22 20:53:43.257711 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.257693 2583 util.go:30] "No sandbox for pod can be found. 
Apr 22 20:53:43.257711 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.257693 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5"
Apr 22 20:53:43.259693 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.259674 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\""
Apr 22 20:53:43.259790 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.259698 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-t9c5k\""
Apr 22 20:53:43.262766 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.262747 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5"]
Apr 22 20:53:43.411724 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.411687 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba06dbc0-da35-4a43-bcd8-60c92f4db382-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-66b4b49586-kxqw5\" (UID: \"ba06dbc0-da35-4a43-bcd8-60c92f4db382\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5"
Apr 22 20:53:43.513063 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.512969 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba06dbc0-da35-4a43-bcd8-60c92f4db382-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-66b4b49586-kxqw5\" (UID: \"ba06dbc0-da35-4a43-bcd8-60c92f4db382\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5"
Apr 22 20:53:43.513331 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.513311 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba06dbc0-da35-4a43-bcd8-60c92f4db382-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-66b4b49586-kxqw5\" (UID: \"ba06dbc0-da35-4a43-bcd8-60c92f4db382\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5"
Apr 22 20:53:43.570015 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.569987 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5"
Apr 22 20:53:43.691371 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.691338 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5"]
Apr 22 20:53:43.693403 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:53:43.693377 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba06dbc0_da35_4a43_bcd8_60c92f4db382.slice/crio-f492b68c6a1bbf75633da622d360aedb7f93feacbca2dfe0acefb2470e49d3b8 WatchSource:0}: Error finding container f492b68c6a1bbf75633da622d360aedb7f93feacbca2dfe0acefb2470e49d3b8: Status 404 returned error can't find the container with id f492b68c6a1bbf75633da622d360aedb7f93feacbca2dfe0acefb2470e49d3b8
Apr 22 20:53:43.927398 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.927301 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" event={"ID":"ba06dbc0-da35-4a43-bcd8-60c92f4db382","Type":"ContainerStarted","Data":"e397af6789a0c9ff3a418be9c08206386882c74fa41012e42884386827ff5840"}
Apr 22 20:53:43.927398 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:43.927350 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" event={"ID":"ba06dbc0-da35-4a43-bcd8-60c92f4db382","Type":"ContainerStarted","Data":"f492b68c6a1bbf75633da622d360aedb7f93feacbca2dfe0acefb2470e49d3b8"}
Apr 22 20:53:44.931720 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:44.931683 2583 generic.go:358] "Generic (PLEG): container finished" podID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerID="e397af6789a0c9ff3a418be9c08206386882c74fa41012e42884386827ff5840" exitCode=0
Apr 22 20:53:44.932145 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:44.931777 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" event={"ID":"ba06dbc0-da35-4a43-bcd8-60c92f4db382","Type":"ContainerDied","Data":"e397af6789a0c9ff3a418be9c08206386882c74fa41012e42884386827ff5840"}
Apr 22 20:53:45.943489 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:45.943449 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" event={"ID":"ba06dbc0-da35-4a43-bcd8-60c92f4db382","Type":"ContainerStarted","Data":"74ed8faf31ed42ea018b07740fcab26f2c7bdef9e1c71dfaf740a5fdb52cf6f6"}
Apr 22 20:53:45.943970 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:45.943601 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5"
Apr 22 20:53:45.945039 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:45.945014 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" podUID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 22 20:53:45.958847 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:45.958801 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" podStartSLOduration=2.958787673 podStartE2EDuration="2.958787673s" podCreationTimestamp="2026-04-22 20:53:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:53:45.957209204 +0000 UTC m=+3314.377115995" watchObservedRunningTime="2026-04-22 20:53:45.958787673 +0000 UTC m=+3314.378694464"
Apr 22 20:53:46.946902 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:46.946844 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" podUID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 22 20:53:56.947318 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:53:56.947223 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" podUID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 22 20:54:06.947376 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:06.947329 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" podUID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 22 20:54:16.947010 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:16.946964 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" podUID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 22 20:54:26.947237 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:26.947185 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" podUID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 22 20:54:36.947684 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:36.947635 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" podUID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 22 20:54:46.947854 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:46.947802 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" podUID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 22 20:54:51.093037 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:51.093001 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5"
Apr 22 20:54:53.356102 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:53.356069 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5"]
Apr 22 20:54:53.356952 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:53.356322 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" podUID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerName="kserve-container" containerID="cri-o://74ed8faf31ed42ea018b07740fcab26f2c7bdef9e1c71dfaf740a5fdb52cf6f6" gracePeriod=30
Apr 22 20:54:53.475263 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:53.475213 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"]
Apr 22 20:54:53.479276 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:53.479250 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"
Apr 22 20:54:53.481126 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:53.481104 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 22 20:54:53.485434 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:53.485410 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"]
Apr 22 20:54:53.529660 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:53.529624 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dbda5ec9-b17e-4835-9adf-4b18d97f4ff7-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f\" (UID: \"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"
Apr 22 20:54:53.529837 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:53.529685 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbda5ec9-b17e-4835-9adf-4b18d97f4ff7-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f\" (UID: \"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"
Apr 22 20:54:53.631282 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:53.631176 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dbda5ec9-b17e-4835-9adf-4b18d97f4ff7-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f\" (UID: \"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"
Apr 22 20:54:53.631282 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:53.631248 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbda5ec9-b17e-4835-9adf-4b18d97f4ff7-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f\" (UID: \"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"
Apr 22 20:54:53.631667 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:53.631644 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbda5ec9-b17e-4835-9adf-4b18d97f4ff7-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f\" (UID: \"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"
Apr 22 20:54:53.631928 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:53.631908 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dbda5ec9-b17e-4835-9adf-4b18d97f4ff7-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f\" (UID: \"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"
Apr 22 20:54:53.792037 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:53.792004 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"
Apr 22 20:54:53.917390 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:53.917357 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"]
Apr 22 20:54:53.920092 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:54:53.920060 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbda5ec9_b17e_4835_9adf_4b18d97f4ff7.slice/crio-6ddaa37434dbfaeb718bf7429509beb74eccb9e3a92b1dfe95a3af6baa989d57 WatchSource:0}: Error finding container 6ddaa37434dbfaeb718bf7429509beb74eccb9e3a92b1dfe95a3af6baa989d57: Status 404 returned error can't find the container with id 6ddaa37434dbfaeb718bf7429509beb74eccb9e3a92b1dfe95a3af6baa989d57
Apr 22 20:54:53.922377 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:53.922358 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:54:54.177178 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:54.177078 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" event={"ID":"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7","Type":"ContainerStarted","Data":"b6a439b8de0df38be8e042de2f019fc0668c050c0ff47afd3b957d8a174aabaf"}
Apr 22 20:54:54.177178 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:54.177122 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" event={"ID":"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7","Type":"ContainerStarted","Data":"6ddaa37434dbfaeb718bf7429509beb74eccb9e3a92b1dfe95a3af6baa989d57"}
Apr 22 20:54:55.181823 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:55.181787 2583 generic.go:358] "Generic (PLEG): container finished" podID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerID="b6a439b8de0df38be8e042de2f019fc0668c050c0ff47afd3b957d8a174aabaf" exitCode=0
Apr 22 20:54:55.182236 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:55.181831 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" event={"ID":"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7","Type":"ContainerDied","Data":"b6a439b8de0df38be8e042de2f019fc0668c050c0ff47afd3b957d8a174aabaf"}
Apr 22 20:54:56.186489 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:56.186451 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" event={"ID":"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7","Type":"ContainerStarted","Data":"d3ec822ffe714fa65203acab2c043f6b29939e01dff64ee5e818d2230be70f2b"}
Apr 22 20:54:56.186945 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:56.186661 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"
Apr 22 20:54:56.188191 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:56.188163 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 22 20:54:56.202849 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:56.202804 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" podStartSLOduration=3.202790788 podStartE2EDuration="3.202790788s" podCreationTimestamp="2026-04-22 20:54:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:54:56.201203032 +0000 UTC m=+3384.621109822" watchObservedRunningTime="2026-04-22 20:54:56.202790788 +0000 UTC m=+3384.622697578"
Apr 22 20:54:57.190370 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:57.190330 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 22 20:54:57.909942 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:57.909916 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5"
Apr 22 20:54:57.972337 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:57.972257 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba06dbc0-da35-4a43-bcd8-60c92f4db382-kserve-provision-location\") pod \"ba06dbc0-da35-4a43-bcd8-60c92f4db382\" (UID: \"ba06dbc0-da35-4a43-bcd8-60c92f4db382\") "
Apr 22 20:54:57.972571 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:57.972549 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba06dbc0-da35-4a43-bcd8-60c92f4db382-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ba06dbc0-da35-4a43-bcd8-60c92f4db382" (UID: "ba06dbc0-da35-4a43-bcd8-60c92f4db382"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:54:58.073639 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:58.073606 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba06dbc0-da35-4a43-bcd8-60c92f4db382-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\""
Apr 22 20:54:58.194776 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:58.194740 2583 generic.go:358] "Generic (PLEG): container finished" podID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerID="74ed8faf31ed42ea018b07740fcab26f2c7bdef9e1c71dfaf740a5fdb52cf6f6" exitCode=0
Apr 22 20:54:58.195277 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:58.194813 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" event={"ID":"ba06dbc0-da35-4a43-bcd8-60c92f4db382","Type":"ContainerDied","Data":"74ed8faf31ed42ea018b07740fcab26f2c7bdef9e1c71dfaf740a5fdb52cf6f6"}
Apr 22 20:54:58.195277 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:58.194829 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5"
Apr 22 20:54:58.195277 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:58.194844 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5" event={"ID":"ba06dbc0-da35-4a43-bcd8-60c92f4db382","Type":"ContainerDied","Data":"f492b68c6a1bbf75633da622d360aedb7f93feacbca2dfe0acefb2470e49d3b8"}
Apr 22 20:54:58.195277 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:58.194882 2583 scope.go:117] "RemoveContainer" containerID="74ed8faf31ed42ea018b07740fcab26f2c7bdef9e1c71dfaf740a5fdb52cf6f6"
Apr 22 20:54:58.203224 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:58.203205 2583 scope.go:117] "RemoveContainer" containerID="e397af6789a0c9ff3a418be9c08206386882c74fa41012e42884386827ff5840"
Apr 22 20:54:58.210981 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:58.210954 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5"]
Apr 22 20:54:58.212445 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:58.212423 2583 scope.go:117] "RemoveContainer" containerID="74ed8faf31ed42ea018b07740fcab26f2c7bdef9e1c71dfaf740a5fdb52cf6f6"
Apr 22 20:54:58.212762 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:54:58.212736 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74ed8faf31ed42ea018b07740fcab26f2c7bdef9e1c71dfaf740a5fdb52cf6f6\": container with ID starting with 74ed8faf31ed42ea018b07740fcab26f2c7bdef9e1c71dfaf740a5fdb52cf6f6 not found: ID does not exist" containerID="74ed8faf31ed42ea018b07740fcab26f2c7bdef9e1c71dfaf740a5fdb52cf6f6"
Apr 22 20:54:58.212880 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:58.212767 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ed8faf31ed42ea018b07740fcab26f2c7bdef9e1c71dfaf740a5fdb52cf6f6"} err="failed to get container status \"74ed8faf31ed42ea018b07740fcab26f2c7bdef9e1c71dfaf740a5fdb52cf6f6\": rpc error: code = NotFound desc = could not find container \"74ed8faf31ed42ea018b07740fcab26f2c7bdef9e1c71dfaf740a5fdb52cf6f6\": container with ID starting with 74ed8faf31ed42ea018b07740fcab26f2c7bdef9e1c71dfaf740a5fdb52cf6f6 not found: ID does not exist"
Apr 22 20:54:58.212880 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:58.212790 2583 scope.go:117] "RemoveContainer" containerID="e397af6789a0c9ff3a418be9c08206386882c74fa41012e42884386827ff5840"
Apr 22 20:54:58.213194 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:54:58.213170 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e397af6789a0c9ff3a418be9c08206386882c74fa41012e42884386827ff5840\": container with ID starting with e397af6789a0c9ff3a418be9c08206386882c74fa41012e42884386827ff5840 not found: ID does not exist" containerID="e397af6789a0c9ff3a418be9c08206386882c74fa41012e42884386827ff5840"
Apr 22 20:54:58.213263 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:58.213207 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e397af6789a0c9ff3a418be9c08206386882c74fa41012e42884386827ff5840"} err="failed to get container status \"e397af6789a0c9ff3a418be9c08206386882c74fa41012e42884386827ff5840\": rpc error: code = NotFound desc = could not find container \"e397af6789a0c9ff3a418be9c08206386882c74fa41012e42884386827ff5840\": container with ID starting with e397af6789a0c9ff3a418be9c08206386882c74fa41012e42884386827ff5840 not found: ID does not exist"
Apr 22 20:54:58.214804 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:54:58.214784 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-66b4b49586-kxqw5"]
Apr 22 20:55:00.098072 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:55:00.098036 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" path="/var/lib/kubelet/pods/ba06dbc0-da35-4a43-bcd8-60c92f4db382/volumes"
Apr 22 20:55:07.190919 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:55:07.190842 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 22 20:55:17.191298 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:55:17.191244 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 22 20:55:27.190748 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:55:27.190660 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 22 20:55:37.191266 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:55:37.191212 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 22 20:55:47.190542 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:55:47.190497 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 22 20:55:57.190410 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:55:57.190369 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 22 20:56:07.191629 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:07.191591 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"
Apr 22 20:56:13.513595 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:13.513565 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"]
Apr 22 20:56:13.514156 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:13.513856 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerName="kserve-container" containerID="cri-o://d3ec822ffe714fa65203acab2c043f6b29939e01dff64ee5e818d2230be70f2b" gracePeriod=30
Apr 22 20:56:14.587996 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:14.587956 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l"]
Apr 22 20:56:14.588532 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:14.588511 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerName="storage-initializer"
Apr 22 20:56:14.588618 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:14.588535 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerName="storage-initializer"
Apr 22 20:56:14.588618 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:14.588550 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerName="kserve-container"
Apr 22 20:56:14.588618 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:14.588559 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerName="kserve-container"
Apr 22 20:56:14.588775 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:14.588649 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba06dbc0-da35-4a43-bcd8-60c92f4db382" containerName="kserve-container"
Apr 22 20:56:14.591961 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:14.591938 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l"
Apr 22 20:56:14.598929 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:14.598902 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l"]
Apr 22 20:56:14.751068 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:14.751029 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d50edc68-9452-4d95-8794-c2ee632337ae-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l\" (UID: \"d50edc68-9452-4d95-8794-c2ee632337ae\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l"
Apr 22 20:56:14.852025 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:14.851929 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d50edc68-9452-4d95-8794-c2ee632337ae-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l\" (UID: \"d50edc68-9452-4d95-8794-c2ee632337ae\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l"
Apr 22 20:56:14.852328 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:14.852306 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d50edc68-9452-4d95-8794-c2ee632337ae-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l\" (UID: \"d50edc68-9452-4d95-8794-c2ee632337ae\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l"
Apr 22 20:56:14.904012 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:14.903971 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l"
Apr 22 20:56:15.031225 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:15.031200 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l"]
Apr 22 20:56:15.033798 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:56:15.033753 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd50edc68_9452_4d95_8794_c2ee632337ae.slice/crio-97126c114ebdacc2702448f880b43c4356ea55070a65e60b4b1db22c1e6fb78a WatchSource:0}: Error finding container 97126c114ebdacc2702448f880b43c4356ea55070a65e60b4b1db22c1e6fb78a: Status 404 returned error can't find the container with id 97126c114ebdacc2702448f880b43c4356ea55070a65e60b4b1db22c1e6fb78a
Apr 22 20:56:15.477477 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:15.477438 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l" event={"ID":"d50edc68-9452-4d95-8794-c2ee632337ae","Type":"ContainerStarted","Data":"4a65d2901eeeabf59aa68286dd77107890f05b498410d28f54cc282c4d075bd7"}
Apr 22 20:56:15.477477 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:15.477475 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l" event={"ID":"d50edc68-9452-4d95-8794-c2ee632337ae","Type":"ContainerStarted","Data":"97126c114ebdacc2702448f880b43c4356ea55070a65e60b4b1db22c1e6fb78a"}
Apr 22 20:56:17.191003 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:17.190951 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 22 20:56:17.859332 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:17.859309 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"
Apr 22 20:56:17.877836 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:17.877812 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbda5ec9-b17e-4835-9adf-4b18d97f4ff7-kserve-provision-location\") pod \"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7\" (UID: \"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7\") "
Apr 22 20:56:17.878000 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:17.877912 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dbda5ec9-b17e-4835-9adf-4b18d97f4ff7-cabundle-cert\") pod \"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7\" (UID: \"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7\") "
Apr 22 20:56:17.878199 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:17.878123 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbda5ec9-b17e-4835-9adf-4b18d97f4ff7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" (UID: "dbda5ec9-b17e-4835-9adf-4b18d97f4ff7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:56:17.878304 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:17.878280 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbda5ec9-b17e-4835-9adf-4b18d97f4ff7-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" (UID: "dbda5ec9-b17e-4835-9adf-4b18d97f4ff7"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:56:17.978448 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:17.978421 2583 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/dbda5ec9-b17e-4835-9adf-4b18d97f4ff7-cabundle-cert\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\""
Apr 22 20:56:17.978448 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:17.978449 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbda5ec9-b17e-4835-9adf-4b18d97f4ff7-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\""
Apr 22 20:56:18.489318 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:18.489280 2583 generic.go:358] "Generic (PLEG): container finished" podID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerID="d3ec822ffe714fa65203acab2c043f6b29939e01dff64ee5e818d2230be70f2b" exitCode=0
Apr 22 20:56:18.489759 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:18.489361 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" event={"ID":"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7","Type":"ContainerDied","Data":"d3ec822ffe714fa65203acab2c043f6b29939e01dff64ee5e818d2230be70f2b"}
Apr 22 20:56:18.489759 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:18.489374 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"
Apr 22 20:56:18.489759 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:18.489403 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f" event={"ID":"dbda5ec9-b17e-4835-9adf-4b18d97f4ff7","Type":"ContainerDied","Data":"6ddaa37434dbfaeb718bf7429509beb74eccb9e3a92b1dfe95a3af6baa989d57"}
Apr 22 20:56:18.489759 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:18.489419 2583 scope.go:117] "RemoveContainer" containerID="d3ec822ffe714fa65203acab2c043f6b29939e01dff64ee5e818d2230be70f2b"
Apr 22 20:56:18.497844 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:18.497826 2583 scope.go:117] "RemoveContainer" containerID="b6a439b8de0df38be8e042de2f019fc0668c050c0ff47afd3b957d8a174aabaf"
Apr 22 20:56:18.503822 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:18.503796 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"]
Apr 22 20:56:18.506307 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:18.506289 2583 scope.go:117] "RemoveContainer" containerID="d3ec822ffe714fa65203acab2c043f6b29939e01dff64ee5e818d2230be70f2b"
Apr 22 20:56:18.506620 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:56:18.506600 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3ec822ffe714fa65203acab2c043f6b29939e01dff64ee5e818d2230be70f2b\": container with ID starting with d3ec822ffe714fa65203acab2c043f6b29939e01dff64ee5e818d2230be70f2b not found: ID does not exist" containerID="d3ec822ffe714fa65203acab2c043f6b29939e01dff64ee5e818d2230be70f2b"
Apr 22 20:56:18.506725 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:18.506627 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ec822ffe714fa65203acab2c043f6b29939e01dff64ee5e818d2230be70f2b"} err="failed to get container status \"d3ec822ffe714fa65203acab2c043f6b29939e01dff64ee5e818d2230be70f2b\": rpc error: code = NotFound desc = could not find container \"d3ec822ffe714fa65203acab2c043f6b29939e01dff64ee5e818d2230be70f2b\": container with ID starting with d3ec822ffe714fa65203acab2c043f6b29939e01dff64ee5e818d2230be70f2b not found: ID does not exist"
Apr 22 20:56:18.506725 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:18.506645 2583 scope.go:117] "RemoveContainer" containerID="b6a439b8de0df38be8e042de2f019fc0668c050c0ff47afd3b957d8a174aabaf"
Apr 22 20:56:18.506973 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:56:18.506946 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6a439b8de0df38be8e042de2f019fc0668c050c0ff47afd3b957d8a174aabaf\": container with ID starting with b6a439b8de0df38be8e042de2f019fc0668c050c0ff47afd3b957d8a174aabaf not found: ID does not exist" containerID="b6a439b8de0df38be8e042de2f019fc0668c050c0ff47afd3b957d8a174aabaf"
Apr 22 20:56:18.507051 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:18.506978 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6a439b8de0df38be8e042de2f019fc0668c050c0ff47afd3b957d8a174aabaf"} err="failed to get container status \"b6a439b8de0df38be8e042de2f019fc0668c050c0ff47afd3b957d8a174aabaf\": rpc error: code = NotFound desc = could not find container \"b6a439b8de0df38be8e042de2f019fc0668c050c0ff47afd3b957d8a174aabaf\": container with ID starting with b6a439b8de0df38be8e042de2f019fc0668c050c0ff47afd3b957d8a174aabaf not found: ID does not exist"
Apr 22 20:56:18.507848 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:18.507826 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-574fdcdcd6-k755f"]
Apr 22 20:56:20.097492 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:20.097406 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" path="/var/lib/kubelet/pods/dbda5ec9-b17e-4835-9adf-4b18d97f4ff7/volumes"
Apr 22 20:56:20.498607 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:20.498576 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l_d50edc68-9452-4d95-8794-c2ee632337ae/storage-initializer/0.log"
Apr 22 20:56:20.498785 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:20.498614 2583 generic.go:358] "Generic (PLEG): container finished" podID="d50edc68-9452-4d95-8794-c2ee632337ae" containerID="4a65d2901eeeabf59aa68286dd77107890f05b498410d28f54cc282c4d075bd7" exitCode=1
Apr 22 20:56:20.498785 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:20.498646 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l" event={"ID":"d50edc68-9452-4d95-8794-c2ee632337ae","Type":"ContainerDied","Data":"4a65d2901eeeabf59aa68286dd77107890f05b498410d28f54cc282c4d075bd7"}
Apr 22 20:56:21.504678 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:21.504648 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l_d50edc68-9452-4d95-8794-c2ee632337ae/storage-initializer/0.log"
Apr 22 20:56:21.505073 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:21.504733 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l" event={"ID":"d50edc68-9452-4d95-8794-c2ee632337ae","Type":"ContainerStarted","Data":"1df8159b842a0df698b13f4b0b6d97c62dd8c27b20eaddfe71e39095458af1b8"}
Apr 22 20:56:24.572888 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:24.572840 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l"]
Apr 22 20:56:24.573397 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:24.573161 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l" podUID="d50edc68-9452-4d95-8794-c2ee632337ae" containerName="storage-initializer" containerID="cri-o://1df8159b842a0df698b13f4b0b6d97c62dd8c27b20eaddfe71e39095458af1b8" gracePeriod=30
Apr 22 20:56:25.644876 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.644837 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"]
Apr 22 20:56:25.645369 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.645351 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerName="storage-initializer"
Apr 22 20:56:25.645430 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.645373 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerName="storage-initializer"
Apr 22 20:56:25.645430 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.645400 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerName="kserve-container"
Apr 22 20:56:25.645430 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.645406 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerName="kserve-container"
Apr 22 20:56:25.645545 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.645472 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="dbda5ec9-b17e-4835-9adf-4b18d97f4ff7" containerName="kserve-container"
Apr 22 20:56:25.648946 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.648927 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"
Apr 22 20:56:25.650712 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.650693 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 22 20:56:25.657642 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.657612 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"]
Apr 22 20:56:25.732757 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.732719 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0fcc3a2f-cb9a-462a-af7a-ab57afe02953-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m\" (UID: \"0fcc3a2f-cb9a-462a-af7a-ab57afe02953\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"
Apr 22 20:56:25.732980 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.732796 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fcc3a2f-cb9a-462a-af7a-ab57afe02953-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m\" (UID: \"0fcc3a2f-cb9a-462a-af7a-ab57afe02953\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"
Apr 22 20:56:25.833339 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.833303 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fcc3a2f-cb9a-462a-af7a-ab57afe02953-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m\" (UID: \"0fcc3a2f-cb9a-462a-af7a-ab57afe02953\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"
Apr 22 20:56:25.833524 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.833360 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0fcc3a2f-cb9a-462a-af7a-ab57afe02953-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m\" (UID: \"0fcc3a2f-cb9a-462a-af7a-ab57afe02953\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"
Apr 22 20:56:25.833689 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.833666 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fcc3a2f-cb9a-462a-af7a-ab57afe02953-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m\" (UID: \"0fcc3a2f-cb9a-462a-af7a-ab57afe02953\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"
Apr 22 20:56:25.833944 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.833928 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0fcc3a2f-cb9a-462a-af7a-ab57afe02953-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m\" (UID: \"0fcc3a2f-cb9a-462a-af7a-ab57afe02953\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"
Apr 22 20:56:25.960345 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:25.960312 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"
Apr 22 20:56:26.086231 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.086209 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"]
Apr 22 20:56:26.089105 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:56:26.089075 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fcc3a2f_cb9a_462a_af7a_ab57afe02953.slice/crio-8baf3a513b09127cf14c0c0fdcf649db3a28a7b8315424ffc9582ff1a8354235 WatchSource:0}: Error finding container 8baf3a513b09127cf14c0c0fdcf649db3a28a7b8315424ffc9582ff1a8354235: Status 404 returned error can't find the container with id 8baf3a513b09127cf14c0c0fdcf649db3a28a7b8315424ffc9582ff1a8354235
Apr 22 20:56:26.516519 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.516496 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l_d50edc68-9452-4d95-8794-c2ee632337ae/storage-initializer/1.log"
Apr 22 20:56:26.516892 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.516856 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l_d50edc68-9452-4d95-8794-c2ee632337ae/storage-initializer/0.log"
Apr 22 20:56:26.517003 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.516950 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l"
Apr 22 20:56:26.524425 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.524402 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" event={"ID":"0fcc3a2f-cb9a-462a-af7a-ab57afe02953","Type":"ContainerStarted","Data":"9cd7c33fe8d9f6b0f96c410dfb5d6f351258fed652eee9734b4a1426e52a14c8"}
Apr 22 20:56:26.524545 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.524439 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" event={"ID":"0fcc3a2f-cb9a-462a-af7a-ab57afe02953","Type":"ContainerStarted","Data":"8baf3a513b09127cf14c0c0fdcf649db3a28a7b8315424ffc9582ff1a8354235"}
Apr 22 20:56:26.525504 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.525487 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l_d50edc68-9452-4d95-8794-c2ee632337ae/storage-initializer/1.log"
Apr 22 20:56:26.525837 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.525818 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l_d50edc68-9452-4d95-8794-c2ee632337ae/storage-initializer/0.log"
Apr 22 20:56:26.525911 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.525895 2583 generic.go:358] "Generic (PLEG): container finished" podID="d50edc68-9452-4d95-8794-c2ee632337ae" containerID="1df8159b842a0df698b13f4b0b6d97c62dd8c27b20eaddfe71e39095458af1b8" exitCode=1
Apr 22 20:56:26.525953 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.525935 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l"
Apr 22 20:56:26.525994 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.525960 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l" event={"ID":"d50edc68-9452-4d95-8794-c2ee632337ae","Type":"ContainerDied","Data":"1df8159b842a0df698b13f4b0b6d97c62dd8c27b20eaddfe71e39095458af1b8"}
Apr 22 20:56:26.525994 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.525988 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l" event={"ID":"d50edc68-9452-4d95-8794-c2ee632337ae","Type":"ContainerDied","Data":"97126c114ebdacc2702448f880b43c4356ea55070a65e60b4b1db22c1e6fb78a"}
Apr 22 20:56:26.526069 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.526005 2583 scope.go:117] "RemoveContainer" containerID="1df8159b842a0df698b13f4b0b6d97c62dd8c27b20eaddfe71e39095458af1b8"
Apr 22 20:56:26.534402 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.534383 2583 scope.go:117] "RemoveContainer" containerID="4a65d2901eeeabf59aa68286dd77107890f05b498410d28f54cc282c4d075bd7"
Apr 22 20:56:26.542285 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.542267 2583 scope.go:117] "RemoveContainer" containerID="1df8159b842a0df698b13f4b0b6d97c62dd8c27b20eaddfe71e39095458af1b8"
Apr 22 20:56:26.542558 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:56:26.542541 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1df8159b842a0df698b13f4b0b6d97c62dd8c27b20eaddfe71e39095458af1b8\": container with ID starting with 1df8159b842a0df698b13f4b0b6d97c62dd8c27b20eaddfe71e39095458af1b8 not found: ID does not exist" containerID="1df8159b842a0df698b13f4b0b6d97c62dd8c27b20eaddfe71e39095458af1b8"
Apr 22 20:56:26.542604 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.542567 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1df8159b842a0df698b13f4b0b6d97c62dd8c27b20eaddfe71e39095458af1b8"} err="failed to get container status \"1df8159b842a0df698b13f4b0b6d97c62dd8c27b20eaddfe71e39095458af1b8\": rpc error: code = NotFound desc = could not find container \"1df8159b842a0df698b13f4b0b6d97c62dd8c27b20eaddfe71e39095458af1b8\": container with ID starting with 1df8159b842a0df698b13f4b0b6d97c62dd8c27b20eaddfe71e39095458af1b8 not found: ID does not exist"
Apr 22 20:56:26.542604 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.542585 2583 scope.go:117] "RemoveContainer" containerID="4a65d2901eeeabf59aa68286dd77107890f05b498410d28f54cc282c4d075bd7"
Apr 22 20:56:26.542816 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:56:26.542792 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a65d2901eeeabf59aa68286dd77107890f05b498410d28f54cc282c4d075bd7\": container with ID starting with 4a65d2901eeeabf59aa68286dd77107890f05b498410d28f54cc282c4d075bd7 not found: ID does not exist" containerID="4a65d2901eeeabf59aa68286dd77107890f05b498410d28f54cc282c4d075bd7"
Apr 22 20:56:26.542959 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.542829 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a65d2901eeeabf59aa68286dd77107890f05b498410d28f54cc282c4d075bd7"} err="failed to get container status \"4a65d2901eeeabf59aa68286dd77107890f05b498410d28f54cc282c4d075bd7\": rpc error: code = NotFound desc = could not find container \"4a65d2901eeeabf59aa68286dd77107890f05b498410d28f54cc282c4d075bd7\": container with ID starting with 4a65d2901eeeabf59aa68286dd77107890f05b498410d28f54cc282c4d075bd7 not found: ID does not exist"
Apr 22 20:56:26.639480 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.639400 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d50edc68-9452-4d95-8794-c2ee632337ae-kserve-provision-location\") pod \"d50edc68-9452-4d95-8794-c2ee632337ae\" (UID: \"d50edc68-9452-4d95-8794-c2ee632337ae\") "
Apr 22 20:56:26.639661 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.639634 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d50edc68-9452-4d95-8794-c2ee632337ae-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d50edc68-9452-4d95-8794-c2ee632337ae" (UID: "d50edc68-9452-4d95-8794-c2ee632337ae"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:56:26.739945 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.739912 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d50edc68-9452-4d95-8794-c2ee632337ae-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\""
Apr 22 20:56:26.856957 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.856927 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l"]
Apr 22 20:56:26.860945 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:26.860917 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-85fd99d856-jbr8l"]
Apr 22 20:56:27.531194 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:27.531159 2583 generic.go:358] "Generic (PLEG): container finished" podID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerID="9cd7c33fe8d9f6b0f96c410dfb5d6f351258fed652eee9734b4a1426e52a14c8" exitCode=0
Apr 22 20:56:27.531405 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:27.531236 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" event={"ID":"0fcc3a2f-cb9a-462a-af7a-ab57afe02953","Type":"ContainerDied","Data":"9cd7c33fe8d9f6b0f96c410dfb5d6f351258fed652eee9734b4a1426e52a14c8"}
Apr 22 20:56:28.096684 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:28.096649 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d50edc68-9452-4d95-8794-c2ee632337ae" path="/var/lib/kubelet/pods/d50edc68-9452-4d95-8794-c2ee632337ae/volumes"
Apr 22 20:56:28.537016 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:28.536975 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" event={"ID":"0fcc3a2f-cb9a-462a-af7a-ab57afe02953","Type":"ContainerStarted","Data":"5a9f72c514e804bcb546c368bb7eeb2c0b07ba297656fe9810d0a81ab0b1bfd1"}
Apr 22 20:56:28.537241 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:28.537219 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"
Apr 22 20:56:28.538609 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:28.538581 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 22 20:56:28.551950 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:28.551895 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" podStartSLOduration=3.551851014 podStartE2EDuration="3.551851014s" podCreationTimestamp="2026-04-22 20:56:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:56:28.551434758 +0000 UTC m=+3476.971341550" watchObservedRunningTime="2026-04-22 20:56:28.551851014 +0000 UTC m=+3476.971757806"
Apr 22 20:56:29.541122 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:29.541075 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 22 20:56:39.541995 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:39.541950 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 22 20:56:49.541494 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:49.541454 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 22 20:56:59.541658 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:56:59.541561 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 22 20:57:09.541989 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:09.541946 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 22 20:57:19.541625 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:19.541578 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 22 20:57:29.541412 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:29.541363 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused"
Apr 22 20:57:39.542036 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:39.542007 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"
Apr 22 20:57:45.676213 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:45.676170 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"]
Apr 22 20:57:45.676796 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:45.676455 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerName="kserve-container" containerID="cri-o://5a9f72c514e804bcb546c368bb7eeb2c0b07ba297656fe9810d0a81ab0b1bfd1" gracePeriod=30
Apr 22 20:57:46.762178 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:46.762138 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j"]
Apr 22 20:57:46.762727 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:46.762706 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d50edc68-9452-4d95-8794-c2ee632337ae" containerName="storage-initializer"
Apr 22 20:57:46.762816 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:46.762729 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50edc68-9452-4d95-8794-c2ee632337ae" containerName="storage-initializer"
Apr 22 20:57:46.762897 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:46.762878 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="d50edc68-9452-4d95-8794-c2ee632337ae" containerName="storage-initializer"
Apr 22 20:57:46.763005 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:46.762991 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d50edc68-9452-4d95-8794-c2ee632337ae" containerName="storage-initializer"
Apr 22 20:57:46.763073 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:46.763006 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50edc68-9452-4d95-8794-c2ee632337ae" containerName="storage-initializer"
Apr 22 20:57:46.763128 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:46.763086 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="d50edc68-9452-4d95-8794-c2ee632337ae" containerName="storage-initializer"
Apr 22 20:57:46.766326 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:46.766304 2583 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j" Apr 22 20:57:46.771961 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:46.771929 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j"] Apr 22 20:57:46.827949 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:46.827913 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c03c765-9849-437c-8895-440419a6974f-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j\" (UID: \"1c03c765-9849-437c-8895-440419a6974f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j" Apr 22 20:57:46.928590 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:46.928555 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c03c765-9849-437c-8895-440419a6974f-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j\" (UID: \"1c03c765-9849-437c-8895-440419a6974f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j" Apr 22 20:57:46.928958 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:46.928939 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c03c765-9849-437c-8895-440419a6974f-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j\" (UID: \"1c03c765-9849-437c-8895-440419a6974f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j" Apr 22 20:57:47.078488 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:47.078401 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j" Apr 22 20:57:47.204377 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:47.204353 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j"] Apr 22 20:57:47.206642 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:57:47.206611 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c03c765_9849_437c_8895_440419a6974f.slice/crio-1a64136d8b7f489c8604d5a4bd27f52ed083ea6f8a1930741536da8c660ad7ee WatchSource:0}: Error finding container 1a64136d8b7f489c8604d5a4bd27f52ed083ea6f8a1930741536da8c660ad7ee: Status 404 returned error can't find the container with id 1a64136d8b7f489c8604d5a4bd27f52ed083ea6f8a1930741536da8c660ad7ee Apr 22 20:57:47.820443 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:47.820402 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j" event={"ID":"1c03c765-9849-437c-8895-440419a6974f","Type":"ContainerStarted","Data":"68a4ac6272b320621bd26b75daa5cac24af6833ad15785325b94b72d79b9fe73"} Apr 22 20:57:47.820443 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:47.820442 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j" event={"ID":"1c03c765-9849-437c-8895-440419a6974f","Type":"ContainerStarted","Data":"1a64136d8b7f489c8604d5a4bd27f52ed083ea6f8a1930741536da8c660ad7ee"} Apr 22 20:57:49.542017 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:49.541972 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 22 20:57:50.225352 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.225319 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" Apr 22 20:57:50.361904 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.361848 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fcc3a2f-cb9a-462a-af7a-ab57afe02953-kserve-provision-location\") pod \"0fcc3a2f-cb9a-462a-af7a-ab57afe02953\" (UID: \"0fcc3a2f-cb9a-462a-af7a-ab57afe02953\") " Apr 22 20:57:50.362085 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.361921 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0fcc3a2f-cb9a-462a-af7a-ab57afe02953-cabundle-cert\") pod \"0fcc3a2f-cb9a-462a-af7a-ab57afe02953\" (UID: \"0fcc3a2f-cb9a-462a-af7a-ab57afe02953\") " Apr 22 20:57:50.362225 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.362199 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fcc3a2f-cb9a-462a-af7a-ab57afe02953-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0fcc3a2f-cb9a-462a-af7a-ab57afe02953" (UID: "0fcc3a2f-cb9a-462a-af7a-ab57afe02953"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:57:50.362265 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.362230 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fcc3a2f-cb9a-462a-af7a-ab57afe02953-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "0fcc3a2f-cb9a-462a-af7a-ab57afe02953" (UID: "0fcc3a2f-cb9a-462a-af7a-ab57afe02953"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:57:50.463113 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.463080 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fcc3a2f-cb9a-462a-af7a-ab57afe02953-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:57:50.463113 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.463111 2583 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0fcc3a2f-cb9a-462a-af7a-ab57afe02953-cabundle-cert\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:57:50.833088 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.833057 2583 generic.go:358] "Generic (PLEG): container finished" podID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerID="5a9f72c514e804bcb546c368bb7eeb2c0b07ba297656fe9810d0a81ab0b1bfd1" exitCode=0 Apr 22 20:57:50.833556 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.833136 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" Apr 22 20:57:50.833556 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.833130 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" event={"ID":"0fcc3a2f-cb9a-462a-af7a-ab57afe02953","Type":"ContainerDied","Data":"5a9f72c514e804bcb546c368bb7eeb2c0b07ba297656fe9810d0a81ab0b1bfd1"} Apr 22 20:57:50.833556 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.833219 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m" event={"ID":"0fcc3a2f-cb9a-462a-af7a-ab57afe02953","Type":"ContainerDied","Data":"8baf3a513b09127cf14c0c0fdcf649db3a28a7b8315424ffc9582ff1a8354235"} Apr 22 20:57:50.833556 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.833243 2583 scope.go:117] "RemoveContainer" containerID="5a9f72c514e804bcb546c368bb7eeb2c0b07ba297656fe9810d0a81ab0b1bfd1" Apr 22 20:57:50.834730 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.834700 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j_1c03c765-9849-437c-8895-440419a6974f/storage-initializer/0.log" Apr 22 20:57:50.834849 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.834740 2583 generic.go:358] "Generic (PLEG): container finished" podID="1c03c765-9849-437c-8895-440419a6974f" containerID="68a4ac6272b320621bd26b75daa5cac24af6833ad15785325b94b72d79b9fe73" exitCode=1 Apr 22 20:57:50.834849 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.834816 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j" event={"ID":"1c03c765-9849-437c-8895-440419a6974f","Type":"ContainerDied","Data":"68a4ac6272b320621bd26b75daa5cac24af6833ad15785325b94b72d79b9fe73"} 
Apr 22 20:57:50.842163 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.842143 2583 scope.go:117] "RemoveContainer" containerID="9cd7c33fe8d9f6b0f96c410dfb5d6f351258fed652eee9734b4a1426e52a14c8" Apr 22 20:57:50.849617 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.849599 2583 scope.go:117] "RemoveContainer" containerID="5a9f72c514e804bcb546c368bb7eeb2c0b07ba297656fe9810d0a81ab0b1bfd1" Apr 22 20:57:50.849918 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:57:50.849879 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a9f72c514e804bcb546c368bb7eeb2c0b07ba297656fe9810d0a81ab0b1bfd1\": container with ID starting with 5a9f72c514e804bcb546c368bb7eeb2c0b07ba297656fe9810d0a81ab0b1bfd1 not found: ID does not exist" containerID="5a9f72c514e804bcb546c368bb7eeb2c0b07ba297656fe9810d0a81ab0b1bfd1" Apr 22 20:57:50.849982 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.849930 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a9f72c514e804bcb546c368bb7eeb2c0b07ba297656fe9810d0a81ab0b1bfd1"} err="failed to get container status \"5a9f72c514e804bcb546c368bb7eeb2c0b07ba297656fe9810d0a81ab0b1bfd1\": rpc error: code = NotFound desc = could not find container \"5a9f72c514e804bcb546c368bb7eeb2c0b07ba297656fe9810d0a81ab0b1bfd1\": container with ID starting with 5a9f72c514e804bcb546c368bb7eeb2c0b07ba297656fe9810d0a81ab0b1bfd1 not found: ID does not exist" Apr 22 20:57:50.849982 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.849950 2583 scope.go:117] "RemoveContainer" containerID="9cd7c33fe8d9f6b0f96c410dfb5d6f351258fed652eee9734b4a1426e52a14c8" Apr 22 20:57:50.850204 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:57:50.850186 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd7c33fe8d9f6b0f96c410dfb5d6f351258fed652eee9734b4a1426e52a14c8\": container with ID starting with 9cd7c33fe8d9f6b0f96c410dfb5d6f351258fed652eee9734b4a1426e52a14c8 not found: ID does not exist" containerID="9cd7c33fe8d9f6b0f96c410dfb5d6f351258fed652eee9734b4a1426e52a14c8" Apr 22 20:57:50.850261 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.850214 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd7c33fe8d9f6b0f96c410dfb5d6f351258fed652eee9734b4a1426e52a14c8"} err="failed to get container status \"9cd7c33fe8d9f6b0f96c410dfb5d6f351258fed652eee9734b4a1426e52a14c8\": rpc error: code = NotFound desc = could not find container \"9cd7c33fe8d9f6b0f96c410dfb5d6f351258fed652eee9734b4a1426e52a14c8\": container with ID starting with 9cd7c33fe8d9f6b0f96c410dfb5d6f351258fed652eee9734b4a1426e52a14c8 not found: ID does not exist" Apr 22 20:57:50.864182 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.864155 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"] Apr 22 20:57:50.866225 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:50.866203 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-86cf7bff78-fvh7m"] Apr 22 20:57:51.840501 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:51.840476 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j_1c03c765-9849-437c-8895-440419a6974f/storage-initializer/0.log" Apr 22 20:57:51.840928 
ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:51.840526 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j" event={"ID":"1c03c765-9849-437c-8895-440419a6974f","Type":"ContainerStarted","Data":"ac2a3c5720120290fa5f025fa0310b7cf2d0db67d13e492240d3f8fbe151baed"} Apr 22 20:57:52.096563 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:52.096479 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" path="/var/lib/kubelet/pods/0fcc3a2f-cb9a-462a-af7a-ab57afe02953/volumes" Apr 22 20:57:53.849432 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:53.849398 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j_1c03c765-9849-437c-8895-440419a6974f/storage-initializer/1.log" Apr 22 20:57:53.849912 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:53.849737 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j_1c03c765-9849-437c-8895-440419a6974f/storage-initializer/0.log" Apr 22 20:57:53.849912 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:53.849771 2583 generic.go:358] "Generic (PLEG): container finished" podID="1c03c765-9849-437c-8895-440419a6974f" containerID="ac2a3c5720120290fa5f025fa0310b7cf2d0db67d13e492240d3f8fbe151baed" exitCode=1 Apr 22 20:57:53.849912 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:53.849807 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j" event={"ID":"1c03c765-9849-437c-8895-440419a6974f","Type":"ContainerDied","Data":"ac2a3c5720120290fa5f025fa0310b7cf2d0db67d13e492240d3f8fbe151baed"} Apr 22 20:57:53.849912 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:53.849836 2583 scope.go:117] "RemoveContainer" containerID="68a4ac6272b320621bd26b75daa5cac24af6833ad15785325b94b72d79b9fe73" Apr 22 20:57:53.850454 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:53.850427 2583 scope.go:117] "RemoveContainer" containerID="68a4ac6272b320621bd26b75daa5cac24af6833ad15785325b94b72d79b9fe73" Apr 22 20:57:53.861378 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:57:53.861336 2583 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j_kserve-ci-e2e-test_1c03c765-9849-437c-8895-440419a6974f_0 in pod sandbox 1a64136d8b7f489c8604d5a4bd27f52ed083ea6f8a1930741536da8c660ad7ee from index: no such id: '68a4ac6272b320621bd26b75daa5cac24af6833ad15785325b94b72d79b9fe73'" containerID="68a4ac6272b320621bd26b75daa5cac24af6833ad15785325b94b72d79b9fe73" Apr 22 20:57:53.861497 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:53.861385 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68a4ac6272b320621bd26b75daa5cac24af6833ad15785325b94b72d79b9fe73"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j_kserve-ci-e2e-test_1c03c765-9849-437c-8895-440419a6974f_0 in pod sandbox 1a64136d8b7f489c8604d5a4bd27f52ed083ea6f8a1930741536da8c660ad7ee from index: no such id: '68a4ac6272b320621bd26b75daa5cac24af6833ad15785325b94b72d79b9fe73'" Apr 22 20:57:53.861614 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:57:53.861597 2583 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j_kserve-ci-e2e-test(1c03c765-9849-437c-8895-440419a6974f)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j" podUID="1c03c765-9849-437c-8895-440419a6974f" Apr 22 20:57:54.854951 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:54.854919 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j_1c03c765-9849-437c-8895-440419a6974f/storage-initializer/1.log" Apr 22 20:57:56.748199 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:56.748165 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j"] Apr 22 20:57:56.882691 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:56.882667 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j_1c03c765-9849-437c-8895-440419a6974f/storage-initializer/1.log" Apr 22 20:57:56.882810 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:56.882731 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j" Apr 22 20:57:57.018747 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.018654 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c03c765-9849-437c-8895-440419a6974f-kserve-provision-location\") pod \"1c03c765-9849-437c-8895-440419a6974f\" (UID: \"1c03c765-9849-437c-8895-440419a6974f\") " Apr 22 20:57:57.019005 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.018985 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c03c765-9849-437c-8895-440419a6974f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1c03c765-9849-437c-8895-440419a6974f" (UID: "1c03c765-9849-437c-8895-440419a6974f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:57:57.120112 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.120081 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c03c765-9849-437c-8895-440419a6974f-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:57:57.803086 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.803053 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58"] Apr 22 20:57:57.803512 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.803497 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerName="storage-initializer" Apr 22 20:57:57.803512 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.803513 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerName="storage-initializer" Apr 22 20:57:57.803594 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.803531 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c03c765-9849-437c-8895-440419a6974f" containerName="storage-initializer" Apr 22 20:57:57.803594 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.803539 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c03c765-9849-437c-8895-440419a6974f" containerName="storage-initializer" Apr 22 20:57:57.803594 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.803554 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerName="kserve-container" Apr 22 20:57:57.803594 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.803563 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerName="kserve-container" Apr 22 20:57:57.803594 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.803577 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c03c765-9849-437c-8895-440419a6974f" containerName="storage-initializer" Apr 22 20:57:57.803594 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.803587 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c03c765-9849-437c-8895-440419a6974f" containerName="storage-initializer" Apr 22 20:57:57.803774 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.803662 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c03c765-9849-437c-8895-440419a6974f" containerName="storage-initializer" Apr 22 20:57:57.803774 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.803672 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c03c765-9849-437c-8895-440419a6974f" containerName="storage-initializer" Apr 22 20:57:57.803774 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.803680 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fcc3a2f-cb9a-462a-af7a-ab57afe02953" containerName="kserve-container" Apr 22 20:57:57.808255 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.808234 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" Apr 22 20:57:57.810472 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.810446 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 22 20:57:57.812923 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.812901 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58"] Apr 22 20:57:57.867084 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.867056 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j_1c03c765-9849-437c-8895-440419a6974f/storage-initializer/1.log" Apr 22 20:57:57.867283 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.867186 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j" Apr 22 20:57:57.867283 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.867182 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j" event={"ID":"1c03c765-9849-437c-8895-440419a6974f","Type":"ContainerDied","Data":"1a64136d8b7f489c8604d5a4bd27f52ed083ea6f8a1930741536da8c660ad7ee"} Apr 22 20:57:57.867403 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.867291 2583 scope.go:117] "RemoveContainer" containerID="ac2a3c5720120290fa5f025fa0310b7cf2d0db67d13e492240d3f8fbe151baed" Apr 22 20:57:57.896607 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.896577 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j"] Apr 22 20:57:57.900773 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.900745 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6c9c9cf6f6-7ml9j"] Apr 22 20:57:57.926940 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.926856 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04db1c44-28d8-453a-a2c4-0019ee3cd563-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58\" (UID: \"04db1c44-28d8-453a-a2c4-0019ee3cd563\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" Apr 22 20:57:57.927100 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:57.926987 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/04db1c44-28d8-453a-a2c4-0019ee3cd563-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58\" (UID: \"04db1c44-28d8-453a-a2c4-0019ee3cd563\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" Apr 22 20:57:58.028478 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:58.028430 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04db1c44-28d8-453a-a2c4-0019ee3cd563-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58\" (UID: \"04db1c44-28d8-453a-a2c4-0019ee3cd563\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" Apr 22 20:57:58.028645 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:58.028506 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/04db1c44-28d8-453a-a2c4-0019ee3cd563-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58\" (UID: \"04db1c44-28d8-453a-a2c4-0019ee3cd563\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" Apr 22 20:57:58.028844 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:58.028823 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04db1c44-28d8-453a-a2c4-0019ee3cd563-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58\" (UID: \"04db1c44-28d8-453a-a2c4-0019ee3cd563\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" Apr 22 20:57:58.029118 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:58.029101 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/04db1c44-28d8-453a-a2c4-0019ee3cd563-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58\" (UID: \"04db1c44-28d8-453a-a2c4-0019ee3cd563\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" Apr 22 20:57:58.096794 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:58.096715 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c03c765-9849-437c-8895-440419a6974f" path="/var/lib/kubelet/pods/1c03c765-9849-437c-8895-440419a6974f/volumes" Apr 22 20:57:58.120177 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:58.120153 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" Apr 22 20:57:58.246312 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:58.246281 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58"] Apr 22 20:57:58.249230 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:57:58.249204 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04db1c44_28d8_453a_a2c4_0019ee3cd563.slice/crio-e6f2191434dee1f23e85d58e10ab2fe4d7c5ced5fa30847e609e419dd382b090 WatchSource:0}: Error finding container e6f2191434dee1f23e85d58e10ab2fe4d7c5ced5fa30847e609e419dd382b090: Status 404 returned error can't find the container with id e6f2191434dee1f23e85d58e10ab2fe4d7c5ced5fa30847e609e419dd382b090 Apr 22 20:57:58.873882 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:58.873817 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" event={"ID":"04db1c44-28d8-453a-a2c4-0019ee3cd563","Type":"ContainerStarted","Data":"e19bdec6327be57ffc765e668bd25aa368aae62c154d8b80bf81f611dd92d86b"} Apr 22 20:57:58.873882 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:58.873872 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" event={"ID":"04db1c44-28d8-453a-a2c4-0019ee3cd563","Type":"ContainerStarted","Data":"e6f2191434dee1f23e85d58e10ab2fe4d7c5ced5fa30847e609e419dd382b090"} Apr 22 20:57:59.879349 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:59.879317 2583 generic.go:358] "Generic (PLEG): container finished" podID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerID="e19bdec6327be57ffc765e668bd25aa368aae62c154d8b80bf81f611dd92d86b" exitCode=0 Apr 22 20:57:59.879749 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:57:59.879364 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" event={"ID":"04db1c44-28d8-453a-a2c4-0019ee3cd563","Type":"ContainerDied","Data":"e19bdec6327be57ffc765e668bd25aa368aae62c154d8b80bf81f611dd92d86b"} Apr 22 20:58:00.884339 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:58:00.884300 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" event={"ID":"04db1c44-28d8-453a-a2c4-0019ee3cd563","Type":"ContainerStarted","Data":"96ab02a033941691d138ed4e4ed0dd03a8065f7432b8cc3cfa0e72b5d4121674"} Apr 22 20:58:00.884785 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:58:00.884489 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" Apr 22 20:58:00.885793 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:58:00.885767 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 22 20:58:00.900516 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:58:00.900465 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" 
podStartSLOduration=3.900452145 podStartE2EDuration="3.900452145s" podCreationTimestamp="2026-04-22 20:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:58:00.898574701 +0000 UTC m=+3569.318481491" watchObservedRunningTime="2026-04-22 20:58:00.900452145 +0000 UTC m=+3569.320358935" Apr 22 20:58:01.888393 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:58:01.888352 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 22 20:58:11.888556 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:58:11.888507 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 22 20:58:21.889350 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:58:21.889256 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 22 20:58:31.888365 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:58:31.888307 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 22 20:58:41.888725 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:58:41.888673 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 22 20:58:51.888650 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:58:51.888601 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 22 20:59:01.888354 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:01.888304 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 22 20:59:11.889683 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:11.889653 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" Apr 22 20:59:17.843151 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:17.843120 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58"] Apr 22 20:59:17.843572 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:17.843374 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerName="kserve-container" containerID="cri-o://96ab02a033941691d138ed4e4ed0dd03a8065f7432b8cc3cfa0e72b5d4121674" gracePeriod=30 Apr 22 20:59:18.908605 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:18.908572 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq"] Apr 22 20:59:18.912708 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:18.912684 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq" Apr 22 20:59:18.920401 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:18.919950 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq"] Apr 22 20:59:19.010587 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:19.010555 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/341bc242-1be1-4237-8e5c-6febd23f7c59-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq\" (UID: \"341bc242-1be1-4237-8e5c-6febd23f7c59\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq" Apr 22 20:59:19.111416 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:19.111376 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/341bc242-1be1-4237-8e5c-6febd23f7c59-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq\" (UID: \"341bc242-1be1-4237-8e5c-6febd23f7c59\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq" Apr 22 20:59:19.111804 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:19.111783 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/341bc242-1be1-4237-8e5c-6febd23f7c59-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq\" (UID: \"341bc242-1be1-4237-8e5c-6febd23f7c59\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq" Apr 22 20:59:19.225131 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:19.225096 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq" Apr 22 20:59:19.351988 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:19.351959 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq"] Apr 22 20:59:20.158114 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:20.158078 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq" event={"ID":"341bc242-1be1-4237-8e5c-6febd23f7c59","Type":"ContainerStarted","Data":"cb2b44b22d51d180bb78b8794c997c2936ba70d89a918d99c1c0745e3c41136a"} Apr 22 20:59:20.158114 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:20.158121 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq" event={"ID":"341bc242-1be1-4237-8e5c-6febd23f7c59","Type":"ContainerStarted","Data":"8df0d58b15b3f8bc3c74a0ec7d408c3c9b2553e60109e4321f615222e2d3df5b"} Apr 22 20:59:21.888697 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:21.888653 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused" Apr 22 20:59:22.166842 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:22.166816 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq_341bc242-1be1-4237-8e5c-6febd23f7c59/storage-initializer/0.log" Apr 22 20:59:22.167069 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:22.166853 2583 generic.go:358] "Generic (PLEG): container finished" podID="341bc242-1be1-4237-8e5c-6febd23f7c59" containerID="cb2b44b22d51d180bb78b8794c997c2936ba70d89a918d99c1c0745e3c41136a" exitCode=1 Apr 22 20:59:22.167069 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:22.166909 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq" event={"ID":"341bc242-1be1-4237-8e5c-6febd23f7c59","Type":"ContainerDied","Data":"cb2b44b22d51d180bb78b8794c997c2936ba70d89a918d99c1c0745e3c41136a"} Apr 22 20:59:22.168895 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:22.168850 2583 generic.go:358] "Generic (PLEG): container finished" podID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerID="96ab02a033941691d138ed4e4ed0dd03a8065f7432b8cc3cfa0e72b5d4121674" exitCode=0 Apr 22 20:59:22.169013 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:22.168901 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" event={"ID":"04db1c44-28d8-453a-a2c4-0019ee3cd563","Type":"ContainerDied","Data":"96ab02a033941691d138ed4e4ed0dd03a8065f7432b8cc3cfa0e72b5d4121674"} Apr 22 20:59:22.303359 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:22.303334 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" Apr 22 20:59:22.337489 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:22.337455 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04db1c44-28d8-453a-a2c4-0019ee3cd563-kserve-provision-location\") pod \"04db1c44-28d8-453a-a2c4-0019ee3cd563\" (UID: \"04db1c44-28d8-453a-a2c4-0019ee3cd563\") " Apr 22 20:59:22.337679 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:22.337521 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/04db1c44-28d8-453a-a2c4-0019ee3cd563-cabundle-cert\") pod \"04db1c44-28d8-453a-a2c4-0019ee3cd563\" (UID: \"04db1c44-28d8-453a-a2c4-0019ee3cd563\") " Apr 22 20:59:22.337877 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:22.337834 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04db1c44-28d8-453a-a2c4-0019ee3cd563-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "04db1c44-28d8-453a-a2c4-0019ee3cd563" (UID: "04db1c44-28d8-453a-a2c4-0019ee3cd563"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:59:22.337958 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:22.337843 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04db1c44-28d8-453a-a2c4-0019ee3cd563-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "04db1c44-28d8-453a-a2c4-0019ee3cd563" (UID: "04db1c44-28d8-453a-a2c4-0019ee3cd563"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:59:22.438312 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:22.438278 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04db1c44-28d8-453a-a2c4-0019ee3cd563-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:59:22.438312 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:22.438306 2583 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/04db1c44-28d8-453a-a2c4-0019ee3cd563-cabundle-cert\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:59:23.174443 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:23.174415 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq_341bc242-1be1-4237-8e5c-6febd23f7c59/storage-initializer/0.log" Apr 22 20:59:23.174971 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:23.174496 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq" event={"ID":"341bc242-1be1-4237-8e5c-6febd23f7c59","Type":"ContainerStarted","Data":"8c1a95175ebaadd73d67d8b041d01a0750de2e1aa7ce3e3d9e77215c84a1eb35"} Apr 22 20:59:23.175952 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:23.175932 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" Apr 22 20:59:23.175952 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:23.175945 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58" event={"ID":"04db1c44-28d8-453a-a2c4-0019ee3cd563","Type":"ContainerDied","Data":"e6f2191434dee1f23e85d58e10ab2fe4d7c5ced5fa30847e609e419dd382b090"} Apr 22 20:59:23.176113 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:23.175986 2583 scope.go:117] "RemoveContainer" containerID="96ab02a033941691d138ed4e4ed0dd03a8065f7432b8cc3cfa0e72b5d4121674" Apr 22 20:59:23.184541 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:23.184523 2583 scope.go:117] "RemoveContainer" containerID="e19bdec6327be57ffc765e668bd25aa368aae62c154d8b80bf81f611dd92d86b" Apr 22 20:59:23.201352 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:23.201327 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58"] Apr 22 20:59:23.205341 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:23.205319 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-777f8b9c4c-tfw58"] Apr 22 20:59:24.096392 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:24.096358 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" path="/var/lib/kubelet/pods/04db1c44-28d8-453a-a2c4-0019ee3cd563/volumes" Apr 22 20:59:26.194546 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:26.194515 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq_341bc242-1be1-4237-8e5c-6febd23f7c59/storage-initializer/1.log" Apr 22 20:59:26.195028 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:26.194850 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq_341bc242-1be1-4237-8e5c-6febd23f7c59/storage-initializer/0.log" Apr 22 20:59:26.195028 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:26.194912 2583 generic.go:358] "Generic (PLEG): container finished" podID="341bc242-1be1-4237-8e5c-6febd23f7c59" containerID="8c1a95175ebaadd73d67d8b041d01a0750de2e1aa7ce3e3d9e77215c84a1eb35" exitCode=1 Apr 22 20:59:26.195028 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:26.194981 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq" event={"ID":"341bc242-1be1-4237-8e5c-6febd23f7c59","Type":"ContainerDied","Data":"8c1a95175ebaadd73d67d8b041d01a0750de2e1aa7ce3e3d9e77215c84a1eb35"} Apr 22 20:59:26.195028 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:26.195022 2583 scope.go:117] "RemoveContainer" containerID="cb2b44b22d51d180bb78b8794c997c2936ba70d89a918d99c1c0745e3c41136a" Apr 22 20:59:26.195411 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:26.195392 2583 scope.go:117] "RemoveContainer" containerID="cb2b44b22d51d180bb78b8794c997c2936ba70d89a918d99c1c0745e3c41136a" Apr 22 20:59:26.208131 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:59:26.208098 2583 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container 
k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq_kserve-ci-e2e-test_341bc242-1be1-4237-8e5c-6febd23f7c59_0 in pod sandbox 8df0d58b15b3f8bc3c74a0ec7d408c3c9b2553e60109e4321f615222e2d3df5b from index: no such id: 'cb2b44b22d51d180bb78b8794c997c2936ba70d89a918d99c1c0745e3c41136a'" containerID="cb2b44b22d51d180bb78b8794c997c2936ba70d89a918d99c1c0745e3c41136a" Apr 22 20:59:26.208239 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:59:26.208142 2583 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq_kserve-ci-e2e-test_341bc242-1be1-4237-8e5c-6febd23f7c59_0 in pod sandbox 8df0d58b15b3f8bc3c74a0ec7d408c3c9b2553e60109e4321f615222e2d3df5b from index: no such id: 'cb2b44b22d51d180bb78b8794c997c2936ba70d89a918d99c1c0745e3c41136a'; Skipping pod \"isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq_kserve-ci-e2e-test(341bc242-1be1-4237-8e5c-6febd23f7c59)\"" logger="UnhandledError" Apr 22 20:59:26.209608 ip-10-0-135-221 kubenswrapper[2583]: E0422 20:59:26.209527 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq_kserve-ci-e2e-test(341bc242-1be1-4237-8e5c-6febd23f7c59)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq" podUID="341bc242-1be1-4237-8e5c-6febd23f7c59" Apr 22 20:59:27.199471 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:27.199444 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq_341bc242-1be1-4237-8e5c-6febd23f7c59/storage-initializer/1.log" Apr 22 20:59:28.901750 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:28.901702 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq"] Apr 22 20:59:29.044477 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:29.044454 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq_341bc242-1be1-4237-8e5c-6febd23f7c59/storage-initializer/1.log" Apr 22 20:59:29.044607 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:29.044521 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq" Apr 22 20:59:29.094959 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:29.094929 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/341bc242-1be1-4237-8e5c-6febd23f7c59-kserve-provision-location\") pod \"341bc242-1be1-4237-8e5c-6febd23f7c59\" (UID: \"341bc242-1be1-4237-8e5c-6febd23f7c59\") " Apr 22 20:59:29.095202 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:29.095177 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/341bc242-1be1-4237-8e5c-6febd23f7c59-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "341bc242-1be1-4237-8e5c-6febd23f7c59" (UID: "341bc242-1be1-4237-8e5c-6febd23f7c59"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:59:29.196275 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:29.196248 2583 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/341bc242-1be1-4237-8e5c-6febd23f7c59-kserve-provision-location\") on node \"ip-10-0-135-221.ec2.internal\" DevicePath \"\"" Apr 22 20:59:29.208144 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:29.208119 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq_341bc242-1be1-4237-8e5c-6febd23f7c59/storage-initializer/1.log" Apr 22 20:59:29.208306 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:29.208162 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq" event={"ID":"341bc242-1be1-4237-8e5c-6febd23f7c59","Type":"ContainerDied","Data":"8df0d58b15b3f8bc3c74a0ec7d408c3c9b2553e60109e4321f615222e2d3df5b"} Apr 22 20:59:29.208306 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:29.208198 2583 scope.go:117] "RemoveContainer" containerID="8c1a95175ebaadd73d67d8b041d01a0750de2e1aa7ce3e3d9e77215c84a1eb35" Apr 22 20:59:29.208306 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:29.208221 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq" Apr 22 20:59:29.240073 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:29.240043 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq"] Apr 22 20:59:29.243368 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:29.243343 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-58d5679b65-mf2bq"] Apr 22 20:59:30.097156 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:30.097124 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="341bc242-1be1-4237-8e5c-6febd23f7c59" path="/var/lib/kubelet/pods/341bc242-1be1-4237-8e5c-6febd23f7c59/volumes" Apr 22 20:59:55.619102 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.619021 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mrjvk/must-gather-ltfg6"] Apr 22 20:59:55.619561 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.619433 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="341bc242-1be1-4237-8e5c-6febd23f7c59" containerName="storage-initializer" Apr 22 20:59:55.619561 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.619444 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="341bc242-1be1-4237-8e5c-6febd23f7c59" containerName="storage-initializer" Apr 22 20:59:55.619561 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.619455 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="341bc242-1be1-4237-8e5c-6febd23f7c59" containerName="storage-initializer" Apr 22 20:59:55.619561 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.619460 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="341bc242-1be1-4237-8e5c-6febd23f7c59" containerName="storage-initializer" Apr 22 20:59:55.619561 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.619476 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerName="storage-initializer" 
Apr 22 20:59:55.619561 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.619484 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerName="storage-initializer"
Apr 22 20:59:55.619561 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.619494 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerName="kserve-container"
Apr 22 20:59:55.619561 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.619499 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerName="kserve-container"
Apr 22 20:59:55.619561 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.619551 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="341bc242-1be1-4237-8e5c-6febd23f7c59" containerName="storage-initializer"
Apr 22 20:59:55.619561 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.619558 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="341bc242-1be1-4237-8e5c-6febd23f7c59" containerName="storage-initializer"
Apr 22 20:59:55.619561 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.619566 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="04db1c44-28d8-453a-a2c4-0019ee3cd563" containerName="kserve-container"
Apr 22 20:59:55.622592 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.622576 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mrjvk/must-gather-ltfg6"
Apr 22 20:59:55.627473 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.627452 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mrjvk\"/\"default-dockercfg-l7rpd\""
Apr 22 20:59:55.627608 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.627549 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mrjvk\"/\"kube-root-ca.crt\""
Apr 22 20:59:55.627678 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.627611 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mrjvk\"/\"openshift-service-ca.crt\""
Apr 22 20:59:55.646673 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.646638 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mrjvk/must-gather-ltfg6"]
Apr 22 20:59:55.736646 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.736608 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d0b76436-42de-47e5-8f61-04626e44f387-must-gather-output\") pod \"must-gather-ltfg6\" (UID: \"d0b76436-42de-47e5-8f61-04626e44f387\") " pod="openshift-must-gather-mrjvk/must-gather-ltfg6"
Apr 22 20:59:55.736824 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.736663 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmv44\" (UniqueName: \"kubernetes.io/projected/d0b76436-42de-47e5-8f61-04626e44f387-kube-api-access-mmv44\") pod \"must-gather-ltfg6\" (UID: \"d0b76436-42de-47e5-8f61-04626e44f387\") " pod="openshift-must-gather-mrjvk/must-gather-ltfg6"
Apr 22 20:59:55.838145 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.838107 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d0b76436-42de-47e5-8f61-04626e44f387-must-gather-output\") pod \"must-gather-ltfg6\" (UID: \"d0b76436-42de-47e5-8f61-04626e44f387\") " pod="openshift-must-gather-mrjvk/must-gather-ltfg6"
Apr 22 20:59:55.838305 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.838163 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmv44\" (UniqueName: \"kubernetes.io/projected/d0b76436-42de-47e5-8f61-04626e44f387-kube-api-access-mmv44\") pod \"must-gather-ltfg6\" (UID: \"d0b76436-42de-47e5-8f61-04626e44f387\") " pod="openshift-must-gather-mrjvk/must-gather-ltfg6"
Apr 22 20:59:55.838433 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.838415 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d0b76436-42de-47e5-8f61-04626e44f387-must-gather-output\") pod \"must-gather-ltfg6\" (UID: \"d0b76436-42de-47e5-8f61-04626e44f387\") " pod="openshift-must-gather-mrjvk/must-gather-ltfg6"
Apr 22 20:59:55.846808 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.846783 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmv44\" (UniqueName: \"kubernetes.io/projected/d0b76436-42de-47e5-8f61-04626e44f387-kube-api-access-mmv44\") pod \"must-gather-ltfg6\" (UID: \"d0b76436-42de-47e5-8f61-04626e44f387\") " pod="openshift-must-gather-mrjvk/must-gather-ltfg6"
Apr 22 20:59:55.947645 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:55.947624 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mrjvk/must-gather-ltfg6"
Apr 22 20:59:56.067362 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:56.067341 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mrjvk/must-gather-ltfg6"]
Apr 22 20:59:56.069514 ip-10-0-135-221 kubenswrapper[2583]: W0422 20:59:56.069489 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0b76436_42de_47e5_8f61_04626e44f387.slice/crio-cda3431f3d3712d8fe27429085eff8d7821cc5eb440e0434ed066655ffab4301 WatchSource:0}: Error finding container cda3431f3d3712d8fe27429085eff8d7821cc5eb440e0434ed066655ffab4301: Status 404 returned error can't find the container with id cda3431f3d3712d8fe27429085eff8d7821cc5eb440e0434ed066655ffab4301
Apr 22 20:59:56.071305 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:56.071290 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:59:56.305044 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:56.304958 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mrjvk/must-gather-ltfg6" event={"ID":"d0b76436-42de-47e5-8f61-04626e44f387","Type":"ContainerStarted","Data":"cda3431f3d3712d8fe27429085eff8d7821cc5eb440e0434ed066655ffab4301"}
Apr 22 20:59:57.310739 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:57.310647 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mrjvk/must-gather-ltfg6" event={"ID":"d0b76436-42de-47e5-8f61-04626e44f387","Type":"ContainerStarted","Data":"4b1115aacef173acf3816f57143a81e9cc9b4f4075b35cab62c2e12df34f9649"}
Apr 22 20:59:57.310739 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:57.310685 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mrjvk/must-gather-ltfg6" event={"ID":"d0b76436-42de-47e5-8f61-04626e44f387","Type":"ContainerStarted","Data":"0608f363c8dc59f1d087c9097f9970a94381ee920da1d499e2f50038ccb746b5"}
Apr 22 20:59:57.325523 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:57.325471 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mrjvk/must-gather-ltfg6" podStartSLOduration=1.438538464 podStartE2EDuration="2.325455286s" podCreationTimestamp="2026-04-22 20:59:55 +0000 UTC" firstStartedPulling="2026-04-22 20:59:56.071415922 +0000 UTC m=+3684.491322690" lastFinishedPulling="2026-04-22 20:59:56.95833273 +0000 UTC m=+3685.378239512" observedRunningTime="2026-04-22 20:59:57.324208789 +0000 UTC m=+3685.744115579" watchObservedRunningTime="2026-04-22 20:59:57.325455286 +0000 UTC m=+3685.745362076"
Apr 22 20:59:58.404535 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:58.404505 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dp22p_d0611db5-4075-4c63-8413-0bc07ce6cf5d/global-pull-secret-syncer/0.log"
Apr 22 20:59:58.609781 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:58.609732 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zsbfm_ab1e4670-372d-4a67-810d-77a48d25a47d/konnectivity-agent/0.log"
Apr 22 20:59:58.653642 ip-10-0-135-221 kubenswrapper[2583]: I0422 20:59:58.653608 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-221.ec2.internal_3c3427913a40e2eb667593f4c197b1b3/haproxy/0.log"
Apr 22 21:00:01.889328 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:01.889291 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5fc053ca-15dc-43bd-ba88-94d8d38038c6/alertmanager/0.log"
Apr 22 21:00:01.913962 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:01.913925 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5fc053ca-15dc-43bd-ba88-94d8d38038c6/config-reloader/0.log"
Apr 22 21:00:01.933129 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:01.933091 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5fc053ca-15dc-43bd-ba88-94d8d38038c6/kube-rbac-proxy-web/0.log"
Apr 22 21:00:01.952987 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:01.952942 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5fc053ca-15dc-43bd-ba88-94d8d38038c6/kube-rbac-proxy/0.log"
Apr 22 21:00:01.970949 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:01.970916 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5fc053ca-15dc-43bd-ba88-94d8d38038c6/kube-rbac-proxy-metric/0.log"
Apr 22 21:00:01.994393 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:01.994354 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5fc053ca-15dc-43bd-ba88-94d8d38038c6/prom-label-proxy/0.log"
Apr 22 21:00:02.013815 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:02.013777 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5fc053ca-15dc-43bd-ba88-94d8d38038c6/init-config-reloader/0.log"
Apr 22 21:00:02.142730 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:02.142602 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-74c8dd4d84-45lt5_a135e0bd-d074-49d7-88ff-c4450222dcd8/metrics-server/0.log"
Apr 22 21:00:02.163758 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:02.163725 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-nbqln_7fdabb89-cd2c-43cf-b2d4-6ff45dd2cad9/monitoring-plugin/0.log"
Apr 22 21:00:02.189093 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:02.189059 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dkb7w_0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc/node-exporter/0.log"
Apr 22 21:00:02.208510 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:02.208479 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dkb7w_0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc/kube-rbac-proxy/0.log"
Apr 22 21:00:02.230646 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:02.230613 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dkb7w_0a45db1e-d88a-441f-9a6c-ce51d7d2cfbc/init-textfile/0.log"
Apr 22 21:00:02.388136 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:02.388098 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lgpsc_2c71d6fd-4e94-4600-b17a-6b70abc22552/kube-rbac-proxy-main/0.log"
Apr 22 21:00:02.407795 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:02.407713 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lgpsc_2c71d6fd-4e94-4600-b17a-6b70abc22552/kube-rbac-proxy-self/0.log"
Apr 22 21:00:02.425788 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:02.425755 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lgpsc_2c71d6fd-4e94-4600-b17a-6b70abc22552/openshift-state-metrics/0.log"
Apr 22 21:00:02.641362 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:02.641331 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-ls8ts_8f9f60bf-8244-42b4-942e-45f9cc8a9567/prometheus-operator-admission-webhook/0.log"
Apr 22 21:00:02.667391 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:02.667260 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-69d7b59889-r8587_6b512aec-107a-4638-b0fc-0230936d2e16/telemeter-client/0.log"
Apr 22 21:00:02.684981 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:02.684950 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-69d7b59889-r8587_6b512aec-107a-4638-b0fc-0230936d2e16/reload/0.log"
Apr 22 21:00:02.704216 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:02.704180 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-69d7b59889-r8587_6b512aec-107a-4638-b0fc-0230936d2e16/kube-rbac-proxy/0.log"
Apr 22 21:00:04.797323 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:04.797286 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f6fd456cd-mtx2p_6bd5a835-01f0-4b34-9121-a4ba62fa8ca5/console/0.log"
Apr 22 21:00:04.842237 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:04.842203 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-78cpf_25a7c1e2-d706-4d4a-8ee5-197c8eec5993/download-server/0.log"
Apr 22 21:00:05.221203 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:05.221175 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-htt2l_47e5d131-56cf-49a5-bc27-d784bcab468a/volume-data-source-validator/0.log"
Apr 22 21:00:05.810013 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:05.809980 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2llws_7b46c2ec-f7aa-4451-90a4-5e3695b9ed78/dns/0.log"
Apr 22 21:00:05.835826 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:05.835800 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2llws_7b46c2ec-f7aa-4451-90a4-5e3695b9ed78/kube-rbac-proxy/0.log"
Apr 22 21:00:06.013326 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.013294 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w5slr_0cb9503d-e2e9-4f70-97aa-e8fa372598fc/dns-node-resolver/0.log"
Apr 22 21:00:06.366946 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.366916 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"]
Apr 22 21:00:06.372067 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.372041 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.376223 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.376134 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"]
Apr 22 21:00:06.435655 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.435624 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed5b2c6a-1bea-4081-88af-74406fe58433-lib-modules\") pod \"perf-node-gather-daemonset-2d88x\" (UID: \"ed5b2c6a-1bea-4081-88af-74406fe58433\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.435844 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.435664 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzm6c\" (UniqueName: \"kubernetes.io/projected/ed5b2c6a-1bea-4081-88af-74406fe58433-kube-api-access-tzm6c\") pod \"perf-node-gather-daemonset-2d88x\" (UID: \"ed5b2c6a-1bea-4081-88af-74406fe58433\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.435844 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.435732 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed5b2c6a-1bea-4081-88af-74406fe58433-sys\") pod \"perf-node-gather-daemonset-2d88x\" (UID: \"ed5b2c6a-1bea-4081-88af-74406fe58433\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.435844 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.435760 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ed5b2c6a-1bea-4081-88af-74406fe58433-proc\") pod \"perf-node-gather-daemonset-2d88x\" (UID: \"ed5b2c6a-1bea-4081-88af-74406fe58433\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.435844 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.435789 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ed5b2c6a-1bea-4081-88af-74406fe58433-podres\") pod \"perf-node-gather-daemonset-2d88x\" (UID: \"ed5b2c6a-1bea-4081-88af-74406fe58433\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.519088 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.519055 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gvfnk_01431c0e-d992-47b0-b2db-613b46bfb3ba/node-ca/0.log"
Apr 22 21:00:06.537030 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.536995 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed5b2c6a-1bea-4081-88af-74406fe58433-sys\") pod \"perf-node-gather-daemonset-2d88x\" (UID: \"ed5b2c6a-1bea-4081-88af-74406fe58433\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.537030 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.537041 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ed5b2c6a-1bea-4081-88af-74406fe58433-proc\") pod \"perf-node-gather-daemonset-2d88x\" (UID: \"ed5b2c6a-1bea-4081-88af-74406fe58433\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.537287 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.537088 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ed5b2c6a-1bea-4081-88af-74406fe58433-podres\") pod \"perf-node-gather-daemonset-2d88x\" (UID: \"ed5b2c6a-1bea-4081-88af-74406fe58433\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.537287 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.537138 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed5b2c6a-1bea-4081-88af-74406fe58433-sys\") pod \"perf-node-gather-daemonset-2d88x\" (UID: \"ed5b2c6a-1bea-4081-88af-74406fe58433\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.537287 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.537202 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ed5b2c6a-1bea-4081-88af-74406fe58433-proc\") pod \"perf-node-gather-daemonset-2d88x\" (UID: \"ed5b2c6a-1bea-4081-88af-74406fe58433\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.537287 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.537238 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed5b2c6a-1bea-4081-88af-74406fe58433-lib-modules\") pod \"perf-node-gather-daemonset-2d88x\" (UID: \"ed5b2c6a-1bea-4081-88af-74406fe58433\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.537287 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.537277 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ed5b2c6a-1bea-4081-88af-74406fe58433-podres\") pod \"perf-node-gather-daemonset-2d88x\" (UID: \"ed5b2c6a-1bea-4081-88af-74406fe58433\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.537287 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.537283 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzm6c\" (UniqueName: \"kubernetes.io/projected/ed5b2c6a-1bea-4081-88af-74406fe58433-kube-api-access-tzm6c\") pod \"perf-node-gather-daemonset-2d88x\" (UID: \"ed5b2c6a-1bea-4081-88af-74406fe58433\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.537605 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.537417 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed5b2c6a-1bea-4081-88af-74406fe58433-lib-modules\") pod \"perf-node-gather-daemonset-2d88x\" (UID: \"ed5b2c6a-1bea-4081-88af-74406fe58433\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.544417 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.544384 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzm6c\" (UniqueName: \"kubernetes.io/projected/ed5b2c6a-1bea-4081-88af-74406fe58433-kube-api-access-tzm6c\") pod \"perf-node-gather-daemonset-2d88x\" (UID: \"ed5b2c6a-1bea-4081-88af-74406fe58433\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.686651 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.686615 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:06.844746 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:06.844445 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"]
Apr 22 21:00:07.354202 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:07.354105 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x" event={"ID":"ed5b2c6a-1bea-4081-88af-74406fe58433","Type":"ContainerStarted","Data":"65982f845b1b74ff2ce3e9dd574bb9324c8209f32dcbbb47e4658b181eb447a3"}
Apr 22 21:00:07.354202 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:07.354149 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x" event={"ID":"ed5b2c6a-1bea-4081-88af-74406fe58433","Type":"ContainerStarted","Data":"96dc8e25d60b521dc5fb201866fa5e6af9264a1324a288b2ed51283ecdfd6188"}
Apr 22 21:00:07.354404 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:07.354261 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:07.370477 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:07.370432 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x" podStartSLOduration=1.370418463 podStartE2EDuration="1.370418463s" podCreationTimestamp="2026-04-22 21:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:00:07.368697915 +0000 UTC m=+3695.788604707" watchObservedRunningTime="2026-04-22 21:00:07.370418463 +0000 UTC m=+3695.790325255"
Apr 22 21:00:13.370336 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:13.370302 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-2d88x"
Apr 22 21:00:53.242037 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:53.242007 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-d7dc7d6db-nq4vr_a7cb3649-fe15-4eeb-a0e1-e3b600a54358/router/0.log"
Apr 22 21:00:53.545238 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:53.545154 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kwgk6_c938b1dd-fed3-4797-aab7-2136204f1cd8/serve-healthcheck-canary/0.log"
Apr 22 21:00:53.909421 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:53.909336 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-br2nv_93becda5-df4b-41f0-954c-ed611504c70c/insights-operator/0.log"
Apr 22 21:00:53.910770 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:53.910743 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-br2nv_93becda5-df4b-41f0-954c-ed611504c70c/insights-operator/1.log"
Apr 22 21:00:53.928185 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:53.928156 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8wd72_9e0754b2-cef4-4b51-b452-93abecb53041/kube-rbac-proxy/0.log"
Apr 22 21:00:53.945623 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:53.945600 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8wd72_9e0754b2-cef4-4b51-b452-93abecb53041/exporter/0.log"
Apr 22 21:00:53.963540 ip-10-0-135-221 kubenswrapper[2583]: I0422 21:00:53.963509 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8wd72_9e0754b2-cef4-4b51-b452-93abecb53041/extractor/0.log"