Apr 24 16:38:51.044376 ip-10-0-128-44 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 16:38:51.444959 ip-10-0-128-44 kubenswrapper[2561]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:38:51.444959 ip-10-0-128-44 kubenswrapper[2561]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 16:38:51.444959 ip-10-0-128-44 kubenswrapper[2561]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:38:51.444959 ip-10-0-128-44 kubenswrapper[2561]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 16:38:51.444959 ip-10-0-128-44 kubenswrapper[2561]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:38:51.447053 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.446962 2561 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 16:38:51.449243 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449228 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:38:51.449243 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449243 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449247 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449251 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449254 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449257 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449260 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449262 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449265 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449268 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449271 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449273 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449282 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449285 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449288 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449291 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449293 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449296 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449298 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449301 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449303 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:38:51.449308 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449306 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449309 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449311 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449314 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449317 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449320 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449323 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449325 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449328 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449331 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449333 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449336 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449338 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449341 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449343 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449346 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449348 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449351 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449353 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:38:51.449778 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449357 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449361 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449364 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449366 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449370 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449374 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449377 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449381 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449383 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449386 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449388 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449391 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449393 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449396 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449398 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449401 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449404 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449407 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449409 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449412 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:38:51.450324 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449415 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449417 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449420 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449423 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449425 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449428 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449430 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449433 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449435 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449438 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449441 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449443 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449446 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449448 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449451 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449453 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449456 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449458 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449461 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449464 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:38:51.450812 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449466 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449469 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449472 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449474 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449477 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449481 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449888 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449895 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449898 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449901 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449903 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449906 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449909 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449911 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449914 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449916 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449919 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449922 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449924 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449927 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:38:51.451323 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449929 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449932 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449935 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449945 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449948 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449950 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449953 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449955 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449960 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449964 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449968 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449971 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449974 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449977 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449980 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449983 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449985 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449988 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449990 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449993 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:38:51.451807 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449996 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.449999 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450003 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450006 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450009 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450012 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450014 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450017 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450019 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450022 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450024 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450027 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450029 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450032 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450034 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450037 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450039 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450042 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450044 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:38:51.452360 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450047 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450049 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450053 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450055 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450059 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450061 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450064 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450067 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450069 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450071 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450074 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450076 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450080 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450083 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450086 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450088 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450091 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450093 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450096 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450098 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:38:51.452851 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450101 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450104 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450107 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450109 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450111 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450126 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450129 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450131 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450134 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450137 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450140 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450142 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.450145 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450762 2561 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450770 2561 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450776 2561 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450780 2561 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450786 2561 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450789 2561 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450794 2561 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450798 2561 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 16:38:51.453357 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450802 2561 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450805 2561 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450809 2561 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450813 2561 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450816 2561 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450819 2561 flags.go:64] FLAG: --cgroup-root=""
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450822 2561 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450826 2561 flags.go:64] FLAG: --client-ca-file=""
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450829 2561 flags.go:64] FLAG: --cloud-config=""
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450832 2561 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450835 2561 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450839 2561 flags.go:64] FLAG: --cluster-domain=""
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450842 2561 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450845 2561 flags.go:64] FLAG: --config-dir=""
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450848 2561 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450851 2561 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450855 2561 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450858 2561 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450861 2561 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450865 2561 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450869 2561 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450872 2561 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450875 2561 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450878 2561 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450881 2561 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 16:38:51.453860 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450885 2561 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450888 2561 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450891 2561 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450893 2561 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450897 2561 flags.go:64] FLAG: --enable-server="true"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450900 2561 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450904 2561 flags.go:64] FLAG: --event-burst="100"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450906 2561 flags.go:64] FLAG: --event-qps="50"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450909 2561 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450912 2561 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450915 2561 flags.go:64] FLAG: --eviction-hard=""
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450919 2561 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450922 2561 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450925 2561 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450928 2561 flags.go:64] FLAG: --eviction-soft=""
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450931 2561 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450934 2561 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450937 2561 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450943 2561 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450946 2561 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450949 2561 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450952 2561 flags.go:64] FLAG: --feature-gates=""
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450956 2561 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450959 2561 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24
16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450962 2561 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 16:38:51.454486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450965 2561 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450969 2561 flags.go:64] FLAG: --healthz-port="10248" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450971 2561 flags.go:64] FLAG: --help="false" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450974 2561 flags.go:64] FLAG: --hostname-override="ip-10-0-128-44.ec2.internal" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450977 2561 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450980 2561 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450983 2561 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450986 2561 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450990 2561 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450992 2561 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450995 2561 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.450998 2561 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451001 2561 flags.go:64] 
FLAG: --kube-api-burst="100" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451004 2561 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451008 2561 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451010 2561 flags.go:64] FLAG: --kube-reserved="" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451014 2561 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451016 2561 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451019 2561 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451022 2561 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451025 2561 flags.go:64] FLAG: --lock-file="" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451028 2561 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451031 2561 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451034 2561 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 16:38:51.455076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451039 2561 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451043 2561 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451046 2561 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 
16:38:51.451049 2561 flags.go:64] FLAG: --logging-format="text" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451053 2561 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451056 2561 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451059 2561 flags.go:64] FLAG: --manifest-url="" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451062 2561 flags.go:64] FLAG: --manifest-url-header="" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451066 2561 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451069 2561 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451073 2561 flags.go:64] FLAG: --max-pods="110" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451076 2561 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451079 2561 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451082 2561 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451085 2561 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451088 2561 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451091 2561 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451094 2561 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451101 2561 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451104 2561 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451108 2561 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451112 2561 flags.go:64] FLAG: --pod-cidr="" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451130 2561 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 16:38:51.455693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451136 2561 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451139 2561 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451142 2561 flags.go:64] FLAG: --pods-per-core="0" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451145 2561 flags.go:64] FLAG: --port="10250" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451148 2561 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451151 2561 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04419e7493d2cd512" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451155 2561 flags.go:64] FLAG: --qos-reserved="" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451158 2561 flags.go:64] FLAG: --read-only-port="10255" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451161 2561 
flags.go:64] FLAG: --register-node="true" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451164 2561 flags.go:64] FLAG: --register-schedulable="true" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451172 2561 flags.go:64] FLAG: --register-with-taints="" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451175 2561 flags.go:64] FLAG: --registry-burst="10" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451178 2561 flags.go:64] FLAG: --registry-qps="5" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451182 2561 flags.go:64] FLAG: --reserved-cpus="" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451185 2561 flags.go:64] FLAG: --reserved-memory="" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451188 2561 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451191 2561 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451194 2561 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451197 2561 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451200 2561 flags.go:64] FLAG: --runonce="false" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451203 2561 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451206 2561 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451213 2561 flags.go:64] FLAG: --seccomp-default="false" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451216 2561 
flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451218 2561 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451222 2561 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 16:38:51.456373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451225 2561 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451228 2561 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451231 2561 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451234 2561 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451237 2561 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451240 2561 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451243 2561 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451247 2561 flags.go:64] FLAG: --system-cgroups="" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451250 2561 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451255 2561 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451258 2561 flags.go:64] FLAG: --tls-cert-file="" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451261 2561 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 
16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451265 2561 flags.go:64] FLAG: --tls-min-version="" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451268 2561 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451271 2561 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451274 2561 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451278 2561 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451281 2561 flags.go:64] FLAG: --v="2" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451285 2561 flags.go:64] FLAG: --version="false" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451289 2561 flags.go:64] FLAG: --vmodule="" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451294 2561 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.451297 2561 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451389 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451394 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:38:51.456996 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451397 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451400 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:38:51.457608 ip-10-0-128-44 
kubenswrapper[2561]: W0424 16:38:51.451403 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451405 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451408 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451410 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451413 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451417 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451420 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451423 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451426 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451430 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451433 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451436 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451439 2561 
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451441 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451444 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451447 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451449 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:38:51.457608 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451452 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451454 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451457 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451459 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451462 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451465 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451468 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451470 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:38:51.458162 ip-10-0-128-44 
kubenswrapper[2561]: W0424 16:38:51.451473 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451476 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451478 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451481 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451484 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451487 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451489 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451492 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451494 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451496 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451499 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451502 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:38:51.458162 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451504 2561 feature_gate.go:328] unrecognized feature gate: 
OVNObservability Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451507 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451509 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451512 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451516 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451518 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451521 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451524 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451527 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451530 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451534 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451537 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451540 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451542 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451545 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451547 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451549 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451554 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451557 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:38:51.458875 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451560 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451562 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451564 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451567 2561 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451569 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451572 2561 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451575 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451577 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451580 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451582 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451585 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451587 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451590 2561 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451592 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451595 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451598 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:38:51.459451 
ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451600 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451604 2561 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451607 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451609 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:38:51.459451 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451612 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:38:51.459943 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451615 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:38:51.459943 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451618 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:38:51.459943 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451620 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:38:51.459943 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451623 2561 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:38:51.459943 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.451625 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:38:51.459943 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.452300 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:38:51.460101 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.460052 2561 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 16:38:51.460101 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.460069 2561 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 16:38:51.460175 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460139 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:38:51.460175 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460144 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:38:51.460175 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460148 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:38:51.460175 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460151 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:38:51.460175 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460154 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:38:51.460175 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460157 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:38:51.460175 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460161 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 16:38:51.460175 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460165 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:38:51.460175 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460167 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:38:51.460175 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460170 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:38:51.460175 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460173 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:38:51.460175 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460175 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:38:51.460175 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460179 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:38:51.460175 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460181 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460184 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460187 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460190 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460193 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460195 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460198 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460200 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460203 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460205 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460208 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460210 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460213 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460215 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460218 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460221 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460224 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460227 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460229 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460232 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:38:51.460513 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460234 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460238 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460243 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460246 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460248 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460251 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460254 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460257 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460259 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460262 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460264 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460267 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460269 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460271 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460275 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460278 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460281 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460284 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460286 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:38:51.461014 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460289 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460291 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460294 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460297 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460299 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460302 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460305 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460307 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460310 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460312 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460315 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460318 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460320 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460323 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460326 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460328 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460331 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460333 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460336 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460338 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:38:51.461556 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460341 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:38:51.462061 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460344 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:38:51.462061 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460346 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:38:51.462061 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460349 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:38:51.462061 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460352 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:38:51.462061 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460354 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:38:51.462061 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460357 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:38:51.462061 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460360 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:38:51.462061 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460363 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:38:51.462061 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460366 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:38:51.462061 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460368 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:38:51.462061 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460371 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:38:51.462061 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460373 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:38:51.462061 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460376 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:38:51.462061 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.460382 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 16:38:51.462061 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460488 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460494 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460498 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460500 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460503 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460506 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460509 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460512 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460515 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460517 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460521 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460524 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460526 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460529 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460532 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460534 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460537 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460540 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460542 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460545 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:38:51.462500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460548 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460551 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460554 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460557 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460559 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460562 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460564 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460567 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460570 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460573 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460576 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460578 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460581 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460583 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460586 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460588 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460590 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460593 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460596 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:38:51.462984 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460598 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460601 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460603 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460606 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460609 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460611 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460614 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460616 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460618 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460621 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460624 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460626 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460629 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460631 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460634 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460637 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460640 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460643 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460645 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460648 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:38:51.463473 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460650 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460652 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460655 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460657 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460660 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460663 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460665 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460669 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460672 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460674 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460678 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460681 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460685 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460688 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460691 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460694 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460697 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460699 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460702 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:38:51.463957 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460705 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:38:51.464419 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460708 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:38:51.464419 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460710 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:38:51.464419 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460713 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:38:51.464419 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460715 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:38:51.464419 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460718 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:38:51.464419 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460721 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:38:51.464419 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:51.460724 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:38:51.464419 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.460729 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 16:38:51.464419 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.461483 2561 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 16:38:51.465997 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.465982 2561 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 16:38:51.467051 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.467039 2561 server.go:1019] "Starting client certificate rotation"
Apr 24 16:38:51.467161 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.467145 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 16:38:51.467197 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.467190 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 16:38:51.489901 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.489882 2561 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 16:38:51.492255 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.492228 2561 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 16:38:51.502753 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.502730 2561 log.go:25] "Validated CRI v1 runtime API"
Apr 24 16:38:51.508644 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.508628 2561 log.go:25] "Validated CRI v1 image API"
Apr 24 16:38:51.511215 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.511193 2561 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 16:38:51.515594 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.515570 2561 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7e6357ee-ee2c-4771-a11a-4a2ae5c176dd:/dev/nvme0n1p4 d234b390-bbe2-447e-ac58-185f95156b71:/dev/nvme0n1p3]
Apr 24 16:38:51.515674 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.515592 2561 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 16:38:51.522205 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.522187 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 16:38:51.523131 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.523011 2561 manager.go:217] Machine: {Timestamp:2026-04-24 16:38:51.521213576 +0000 UTC m=+0.374084281 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3093999 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24ef948a87e46ba3014e366a5b4d13 SystemUUID:ec24ef94-8a87-e46b-a301-4e366a5b4d13 BootID:cde1c0e1-9e79-4790-9121-0c974163484c Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f4:b9:c1:52:0d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f4:b9:c1:52:0d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f2:6b:dc:f1:df:aa Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 16:38:51.523131 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.523129 2561 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 16:38:51.523245 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.523218 2561 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 16:38:51.523559 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.523538 2561 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 16:38:51.523693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.523560 2561 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-44.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 16:38:51.523734 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.523703 2561 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 16:38:51.523734 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.523712 2561 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 16:38:51.523734 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.523725 2561 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 16:38:51.524517 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.524504 2561 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 16:38:51.525896 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.525886 2561 state_mem.go:36] "Initialized new in-memory state store" Apr 24 16:38:51.526006 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.525997 2561 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 16:38:51.528135 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.528125 2561 kubelet.go:491] "Attempting to sync node with API server" Apr 24 16:38:51.528175 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.528140 2561 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 16:38:51.528175 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.528153 2561 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 16:38:51.528175 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.528162 2561 kubelet.go:397] "Adding apiserver pod source" Apr 24 16:38:51.528175 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.528170 2561 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 24 16:38:51.529111 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.529099 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 16:38:51.529184 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.529131 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 16:38:51.532246 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.532229 2561 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 16:38:51.533567 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.533553 2561 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 16:38:51.535303 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.535292 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 16:38:51.535348 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.535309 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 16:38:51.535348 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.535315 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 16:38:51.535348 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.535321 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 16:38:51.535348 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.535329 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 16:38:51.535348 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.535337 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 16:38:51.535348 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.535347 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 
16:38:51.535535 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.535357 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 16:38:51.535535 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.535365 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 16:38:51.535535 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.535371 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 16:38:51.535535 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.535419 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 16:38:51.535535 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.535434 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 16:38:51.536230 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.536218 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 16:38:51.536266 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.536233 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 16:38:51.539917 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.539903 2561 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 16:38:51.539968 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.539938 2561 server.go:1295] "Started kubelet" Apr 24 16:38:51.540020 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.539991 2561 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 16:38:51.540074 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.540033 2561 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 16:38:51.540108 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.540096 2561 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 16:38:51.540813 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.540791 2561 reflector.go:200] "Failed to watch" err="failed 
to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 16:38:51.540902 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.540888 2561 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-44.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 16:38:51.540902 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.540863 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-44.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 16:38:51.540898 ip-10-0-128-44 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 16:38:51.541236 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.541158 2561 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 16:38:51.541981 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.541965 2561 server.go:317] "Adding debug handlers to kubelet server" Apr 24 16:38:51.544994 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.544972 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4fdsm" Apr 24 16:38:51.545355 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.544341 2561 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-44.ec2.internal.18a95867f7fec262 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-44.ec2.internal,UID:ip-10-0-128-44.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-44.ec2.internal,},FirstTimestamp:2026-04-24 16:38:51.539915362 +0000 UTC m=+0.392786067,LastTimestamp:2026-04-24 16:38:51.539915362 +0000 UTC m=+0.392786067,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-44.ec2.internal,}" Apr 24 16:38:51.547068 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.547054 2561 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 16:38:51.547163 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.547072 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 16:38:51.548096 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.547782 2561 
volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 16:38:51.548096 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.547804 2561 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 16:38:51.548096 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.547909 2561 reconstruct.go:97] "Volume reconstruction finished" Apr 24 16:38:51.548096 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.547919 2561 reconciler.go:26] "Reconciler: start to sync state" Apr 24 16:38:51.548096 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.547961 2561 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 16:38:51.548096 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.547985 2561 factory.go:55] Registering systemd factory Apr 24 16:38:51.548096 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.548044 2561 factory.go:223] Registration of the systemd container factory successfully Apr 24 16:38:51.548096 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.548063 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-44.ec2.internal\" not found" Apr 24 16:38:51.548486 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.548476 2561 factory.go:153] Registering CRI-O factory Apr 24 16:38:51.548546 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.548490 2561 factory.go:223] Registration of the crio container factory successfully Apr 24 16:38:51.548595 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.548548 2561 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 16:38:51.548595 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.548574 2561 factory.go:103] Registering Raw factory Apr 24 16:38:51.548595 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.548589 2561 manager.go:1196] 
Started watching for new ooms in manager Apr 24 16:38:51.549819 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.549803 2561 manager.go:319] Starting recovery of all containers Apr 24 16:38:51.551067 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.551036 2561 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 16:38:51.557871 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.554316 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4fdsm" Apr 24 16:38:51.558894 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.558717 2561 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-44.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 16:38:51.559207 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.559177 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 16:38:51.562359 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.562343 2561 manager.go:324] Recovery completed Apr 24 16:38:51.566536 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.566519 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:51.569405 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.569386 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:51.569466 ip-10-0-128-44 
kubenswrapper[2561]: I0424 16:38:51.569418 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:51.569466 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.569428 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:51.569890 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.569878 2561 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 16:38:51.569890 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.569887 2561 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 16:38:51.569988 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.569904 2561 state_mem.go:36] "Initialized new in-memory state store" Apr 24 16:38:51.571264 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.571193 2561 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-44.ec2.internal.18a95867f9c0bbb2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-44.ec2.internal,UID:ip-10-0-128-44.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-44.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-44.ec2.internal,},FirstTimestamp:2026-04-24 16:38:51.56940485 +0000 UTC m=+0.422275556,LastTimestamp:2026-04-24 16:38:51.56940485 +0000 UTC m=+0.422275556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-44.ec2.internal,}" Apr 24 16:38:51.572481 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.572469 2561 policy_none.go:49] "None policy: Start" 
Apr 24 16:38:51.572542 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.572485 2561 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 16:38:51.572542 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.572495 2561 state_mem.go:35] "Initializing new in-memory state store" Apr 24 16:38:51.623862 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.623845 2561 manager.go:341] "Starting Device Plugin manager" Apr 24 16:38:51.623993 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.623893 2561 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 16:38:51.623993 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.623904 2561 server.go:85] "Starting device plugin registration server" Apr 24 16:38:51.624234 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.624218 2561 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 16:38:51.624352 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.624233 2561 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 16:38:51.624352 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.624338 2561 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 16:38:51.624561 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.624445 2561 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 16:38:51.624561 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.624464 2561 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 16:38:51.625386 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.625365 2561 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 24 16:38:51.625478 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.625414 2561 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-44.ec2.internal\" not found" Apr 24 16:38:51.644832 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.644794 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 16:38:51.646005 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.645981 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 16:38:51.646005 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.646010 2561 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 16:38:51.646181 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.646041 2561 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 16:38:51.646181 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.646050 2561 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 16:38:51.646181 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.646087 2561 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 16:38:51.648538 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.648511 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:51.724755 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.724644 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:51.725858 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.725840 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:51.725965 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.725874 2561 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:51.725965 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.725887 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:51.725965 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.725910 2561 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-44.ec2.internal" Apr 24 16:38:51.734113 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.734097 2561 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-44.ec2.internal" Apr 24 16:38:51.734198 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.734133 2561 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-44.ec2.internal\": node \"ip-10-0-128-44.ec2.internal\" not found" Apr 24 16:38:51.746352 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.746330 2561 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-44.ec2.internal"] Apr 24 16:38:51.746461 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.746382 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-44.ec2.internal\" not found" Apr 24 16:38:51.746461 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.746398 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:51.747419 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.747404 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:51.747479 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.747432 2561 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-128-44.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:51.747479 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.747443 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:51.748715 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.748703 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:51.748850 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.748837 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal" Apr 24 16:38:51.748887 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.748867 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:51.749471 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.749451 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:51.749556 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.749480 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:51.749556 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.749490 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:51.749556 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.749455 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:51.749556 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.749556 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:51.749690 
ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.749567 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:51.750769 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.750753 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-44.ec2.internal" Apr 24 16:38:51.750833 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.750787 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:51.752377 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.752359 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:51.752456 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.752386 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:51.752456 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.752399 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:51.772778 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.772750 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-44.ec2.internal\" not found" node="ip-10-0-128-44.ec2.internal" Apr 24 16:38:51.776696 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.776682 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-44.ec2.internal\" not found" node="ip-10-0-128-44.ec2.internal" Apr 24 16:38:51.846950 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.846917 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-44.ec2.internal\" not found" Apr 
24 16:38:51.849212 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.849189 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/883e2fac0a836c5267813802443d2ab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal\" (UID: \"883e2fac0a836c5267813802443d2ab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal" Apr 24 16:38:51.849283 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.849220 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7c14f0193c74d30d8fc8bec912702c00-config\") pod \"kube-apiserver-proxy-ip-10-0-128-44.ec2.internal\" (UID: \"7c14f0193c74d30d8fc8bec912702c00\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-44.ec2.internal" Apr 24 16:38:51.849283 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.849237 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/883e2fac0a836c5267813802443d2ab5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal\" (UID: \"883e2fac0a836c5267813802443d2ab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal" Apr 24 16:38:51.947803 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:51.947752 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-44.ec2.internal\" not found" Apr 24 16:38:51.949990 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.949970 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/883e2fac0a836c5267813802443d2ab5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal\" (UID: \"883e2fac0a836c5267813802443d2ab5\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal" Apr 24 16:38:51.950075 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.950007 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/883e2fac0a836c5267813802443d2ab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal\" (UID: \"883e2fac0a836c5267813802443d2ab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal" Apr 24 16:38:51.950075 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.950031 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7c14f0193c74d30d8fc8bec912702c00-config\") pod \"kube-apiserver-proxy-ip-10-0-128-44.ec2.internal\" (UID: \"7c14f0193c74d30d8fc8bec912702c00\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-44.ec2.internal" Apr 24 16:38:51.950075 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.950066 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7c14f0193c74d30d8fc8bec912702c00-config\") pod \"kube-apiserver-proxy-ip-10-0-128-44.ec2.internal\" (UID: \"7c14f0193c74d30d8fc8bec912702c00\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-44.ec2.internal" Apr 24 16:38:51.950210 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.950068 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/883e2fac0a836c5267813802443d2ab5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal\" (UID: \"883e2fac0a836c5267813802443d2ab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal" Apr 24 16:38:51.950210 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:51.950068 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/883e2fac0a836c5267813802443d2ab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal\" (UID: \"883e2fac0a836c5267813802443d2ab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal" Apr 24 16:38:52.048591 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:52.048518 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-44.ec2.internal\" not found" Apr 24 16:38:52.074806 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.074790 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal" Apr 24 16:38:52.079391 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.079374 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-44.ec2.internal" Apr 24 16:38:52.149251 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:52.149211 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-44.ec2.internal\" not found" Apr 24 16:38:52.249743 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:52.249704 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-44.ec2.internal\" not found" Apr 24 16:38:52.350323 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:52.350243 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-44.ec2.internal\" not found" Apr 24 16:38:52.450807 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:52.450779 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-44.ec2.internal\" not found" Apr 24 16:38:52.467085 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.467063 2561 transport.go:147] "Certificate rotation detected, shutting down client 
connections to start using new credentials" Apr 24 16:38:52.467283 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.467216 2561 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 16:38:52.547384 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.547356 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 16:38:52.550894 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:52.550873 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-44.ec2.internal\" not found" Apr 24 16:38:52.560667 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.560632 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 16:33:51 +0000 UTC" deadline="2027-11-16 19:58:33.730939327 +0000 UTC" Apr 24 16:38:52.560667 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.560665 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13707h19m41.170277579s" Apr 24 16:38:52.561578 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.561563 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 16:38:52.565953 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:52.565920 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod883e2fac0a836c5267813802443d2ab5.slice/crio-49384903a4ced2f4948d59ce95d98efb436fda718b8b170c4376607cbd2f7c1e WatchSource:0}: Error finding container 
49384903a4ced2f4948d59ce95d98efb436fda718b8b170c4376607cbd2f7c1e: Status 404 returned error can't find the container with id 49384903a4ced2f4948d59ce95d98efb436fda718b8b170c4376607cbd2f7c1e Apr 24 16:38:52.566538 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:52.566519 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c14f0193c74d30d8fc8bec912702c00.slice/crio-8592cfed2620b0afe62a8709203e69ac27c8c786b1f5502ceae67adb6c7ae650 WatchSource:0}: Error finding container 8592cfed2620b0afe62a8709203e69ac27c8c786b1f5502ceae67adb6c7ae650: Status 404 returned error can't find the container with id 8592cfed2620b0afe62a8709203e69ac27c8c786b1f5502ceae67adb6c7ae650 Apr 24 16:38:52.570543 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.570530 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:38:52.596509 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.596483 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-r4nj5" Apr 24 16:38:52.602817 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.602750 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-r4nj5" Apr 24 16:38:52.649357 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.649307 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-44.ec2.internal" event={"ID":"7c14f0193c74d30d8fc8bec912702c00","Type":"ContainerStarted","Data":"8592cfed2620b0afe62a8709203e69ac27c8c786b1f5502ceae67adb6c7ae650"} Apr 24 16:38:52.650222 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.650195 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal" 
event={"ID":"883e2fac0a836c5267813802443d2ab5","Type":"ContainerStarted","Data":"49384903a4ced2f4948d59ce95d98efb436fda718b8b170c4376607cbd2f7c1e"} Apr 24 16:38:52.651280 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:52.651265 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-44.ec2.internal\" not found" Apr 24 16:38:52.751875 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:52.751846 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-44.ec2.internal\" not found" Apr 24 16:38:52.850193 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.850167 2561 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:52.851985 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:52.851957 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-44.ec2.internal\" not found" Apr 24 16:38:52.895129 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.895041 2561 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:52.941460 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.941435 2561 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:52.947469 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.947450 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal" Apr 24 16:38:52.960157 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.960133 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:38:52.961083 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.961070 2561 kubelet.go:3340] "Creating a mirror pod for static 
pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-44.ec2.internal" Apr 24 16:38:52.970398 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:52.970142 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:38:53.529055 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.529018 2561 apiserver.go:52] "Watching apiserver" Apr 24 16:38:53.536644 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.536621 2561 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 16:38:53.537054 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.537027 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-9bw4h","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz","openshift-multus/multus-additional-cni-plugins-k7nht","openshift-multus/network-metrics-daemon-6c75x","openshift-ovn-kubernetes/ovnkube-node-rrwzc","kube-system/kube-apiserver-proxy-ip-10-0-128-44.ec2.internal","openshift-cluster-node-tuning-operator/tuned-hkhzt","openshift-image-registry/node-ca-wcmwg","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal","openshift-multus/multus-5qxgh","openshift-network-diagnostics/network-check-target-2p9kk","openshift-network-operator/iptables-alerter-5bm47"] Apr 24 16:38:53.539160 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.539108 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9bw4h" Apr 24 16:38:53.540321 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.540300 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" Apr 24 16:38:53.541597 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.541572 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.541597 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.541582 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 16:38:53.541751 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.541581 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mtr2z\"" Apr 24 16:38:53.541751 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.541643 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 16:38:53.542405 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.542380 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 16:38:53.542518 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.542442 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xqkrc\"" Apr 24 16:38:53.542518 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.542453 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 16:38:53.542726 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.542705 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 16:38:53.542828 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.542812 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:38:53.542898 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:53.542876 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:38:53.543880 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.543862 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 16:38:53.544333 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.544185 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.544333 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.544198 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 16:38:53.544333 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.544296 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 16:38:53.544510 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.544370 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 16:38:53.544510 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.544422 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 16:38:53.544776 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.544762 2561 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-q4njf\"" Apr 24 16:38:53.545374 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.545356 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" Apr 24 16:38:53.546448 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.546432 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wcmwg" Apr 24 16:38:53.547861 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.547828 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.548492 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.548469 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7t26t\"" Apr 24 16:38:53.548866 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.548776 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 16:38:53.548866 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.548777 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 16:38:53.550756 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.549977 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 16:38:53.550756 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.550198 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 16:38:53.550756 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.550208 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:38:53.550756 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.550405 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 16:38:53.550756 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.550425 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:38:53.550756 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:53.550659 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c" Apr 24 16:38:53.551060 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.550920 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dzdl9\"" Apr 24 16:38:53.551398 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.551170 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 16:38:53.551398 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.551286 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 16:38:53.553225 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.553208 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-5bm47" Apr 24 16:38:53.553404 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.553374 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 16:38:53.557952 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.557922 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-sys-fs\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" Apr 24 16:38:53.558054 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.557966 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57df94e3-357d-4303-9788-f5e7a9d03ae9-env-overrides\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.558054 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.557993 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-run-k8s-cni-cncf-io\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.558054 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558044 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0cf50135-6483-41fd-a681-60b2bfc1a66d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " 
pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.558232 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558145 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-var-lib-cni-multus\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.558232 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558178 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-sys\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" Apr 24 16:38:53.558232 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558151 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 16:38:53.558232 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558195 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 16:38:53.558420 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558216 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbh9g\" (UniqueName: \"kubernetes.io/projected/0cf50135-6483-41fd-a681-60b2bfc1a66d-kube-api-access-mbh9g\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.558420 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558293 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-run-ovn\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.558420 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558318 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c71dad60-6f43-425f-9755-72c272d5116b-tmp\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" Apr 24 16:38:53.558420 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558346 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-kubelet\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.558420 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558370 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-etc-kubernetes\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.558420 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558392 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-modprobe-d\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" Apr 24 16:38:53.558420 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558416 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6cp7\" (UniqueName: \"kubernetes.io/projected/c71dad60-6f43-425f-9755-72c272d5116b-kube-api-access-k6cp7\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" Apr 24 16:38:53.558739 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558438 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-socket-dir\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" Apr 24 16:38:53.558739 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558457 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 16:38:53.558739 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558467 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:38:53.558739 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558459 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-etc-selinux\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" Apr 24 16:38:53.558739 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558555 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-etc-openvswitch\") pod \"ovnkube-node-rrwzc\" (UID: 
\"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.558739 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558577 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-cni-bin\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.558739 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558600 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-os-release\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.558739 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558623 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0cf50135-6483-41fd-a681-60b2bfc1a66d-cni-binary-copy\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.558739 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558646 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-slash\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.558739 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558668 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-registration-dir\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" Apr 24 16:38:53.558739 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558692 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cf50135-6483-41fd-a681-60b2bfc1a66d-system-cni-dir\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.558739 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558715 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-run-ovn-kubernetes\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558758 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8m94\" (UniqueName: \"kubernetes.io/projected/5461250a-930b-44a6-ba77-773586840e32-kube-api-access-l8m94\") pod \"node-ca-wcmwg\" (UID: \"5461250a-930b-44a6-ba77-773586840e32\") " pod="openshift-image-registry/node-ca-wcmwg" Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558796 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-kubernetes\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" Apr 24 16:38:53.559298 
ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558815 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0cf50135-6483-41fd-a681-60b2bfc1a66d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558844 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs\") pod \"network-metrics-daemon-6c75x\" (UID: \"7abff685-57e5-4da6-b220-dbc6c56835ed\") " pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558874 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558894 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-system-cni-dir\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558916 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-cni-binary-copy\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.558948 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-run-multus-certs\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559001 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-var-lib-kubelet\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559032 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-device-dir\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz"
Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559074 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0cf50135-6483-41fd-a681-60b2bfc1a66d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht"
Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559101 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxx2z\" (UniqueName: \"kubernetes.io/projected/7abff685-57e5-4da6-b220-dbc6c56835ed-kube-api-access-vxx2z\") pod \"network-metrics-daemon-6c75x\" (UID: \"7abff685-57e5-4da6-b220-dbc6c56835ed\") " pod="openshift-multus/network-metrics-daemon-6c75x"
Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559110 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559146 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-run-netns\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559168 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-cni-netd\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559172 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4h7z6\""
Apr 24 16:38:53.559298 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559190 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-multus-conf-dir\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559211 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-host\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559234 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-run-systemd\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559258 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-var-lib-openvswitch\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559213 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559296 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5360254e-a924-4ddf-8043-9c43f8152fe3-agent-certs\") pod \"konnectivity-agent-9bw4h\" (UID: \"5360254e-a924-4ddf-8043-9c43f8152fe3\") " pod="kube-system/konnectivity-agent-9bw4h"
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559351 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hpbqv\""
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559350 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5360254e-a924-4ddf-8043-9c43f8152fe3-konnectivity-ca\") pod \"konnectivity-agent-9bw4h\" (UID: \"5360254e-a924-4ddf-8043-9c43f8152fe3\") " pod="kube-system/konnectivity-agent-9bw4h"
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559396 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz"
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559419 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0cf50135-6483-41fd-a681-60b2bfc1a66d-os-release\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht"
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559440 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-log-socket\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559463 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-multus-cni-dir\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559487 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gch6\" (UniqueName: \"kubernetes.io/projected/03b3f80f-e0be-4a62-92c7-eba566136dc7-kube-api-access-5gch6\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz"
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559490 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-tr4zg\""
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559511 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57df94e3-357d-4303-9788-f5e7a9d03ae9-ovn-node-metrics-cert\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559549 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57df94e3-357d-4303-9788-f5e7a9d03ae9-ovnkube-script-lib\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559573 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g75d\" (UniqueName: \"kubernetes.io/projected/57df94e3-357d-4303-9788-f5e7a9d03ae9-kube-api-access-4g75d\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559596 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-multus-socket-dir-parent\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.559966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559625 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-run\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559648 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0cf50135-6483-41fd-a681-60b2bfc1a66d-cnibin\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559682 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-systemd-units\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559710 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-hostroot\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559744 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57df94e3-357d-4303-9788-f5e7a9d03ae9-ovnkube-config\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559762 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5461250a-930b-44a6-ba77-773586840e32-host\") pod \"node-ca-wcmwg\" (UID: \"5461250a-930b-44a6-ba77-773586840e32\") " pod="openshift-image-registry/node-ca-wcmwg"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559777 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-sysconfig\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559791 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-node-log\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559810 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-run-netns\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559839 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-sysctl-conf\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559859 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-multus-daemon-config\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559876 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75l9n\" (UniqueName: \"kubernetes.io/projected/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-kube-api-access-75l9n\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559890 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhj7k\" (UniqueName: \"kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k\") pod \"network-check-target-2p9kk\" (UID: \"f32abd88-b479-49d4-b013-a952386d695c\") " pod="openshift-network-diagnostics/network-check-target-2p9kk"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559910 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c71dad60-6f43-425f-9755-72c272d5116b-etc-tuned\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.559932 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-run-openvswitch\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.560006 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5461250a-930b-44a6-ba77-773586840e32-serviceca\") pod \"node-ca-wcmwg\" (UID: \"5461250a-930b-44a6-ba77-773586840e32\") " pod="openshift-image-registry/node-ca-wcmwg"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.560085 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-cnibin\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.560688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.560136 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-var-lib-cni-bin\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.561219 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.560166 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-var-lib-kubelet\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.561219 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.560192 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-sysctl-d\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.561219 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.560218 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-systemd\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.561219 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.560243 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-lib-modules\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.577411 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.577387 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:38:53.604041 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.604013 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:33:52 +0000 UTC" deadline="2028-01-24 00:09:16.793029326 +0000 UTC"
Apr 24 16:38:53.604041 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.604038 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15343h30m23.188994146s"
Apr 24 16:38:53.648973 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.648944 2561 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 16:38:53.661150 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661103 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-run-openvswitch\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.661304 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661164 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5461250a-930b-44a6-ba77-773586840e32-serviceca\") pod \"node-ca-wcmwg\" (UID: \"5461250a-930b-44a6-ba77-773586840e32\") " pod="openshift-image-registry/node-ca-wcmwg"
Apr 24 16:38:53.661304 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661195 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-cnibin\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.661304 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661225 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-var-lib-cni-bin\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.661304 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661290 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-var-lib-kubelet\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.661506 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661313 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-var-lib-cni-bin\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.661506 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661329 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-cnibin\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.661506 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661226 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-run-openvswitch\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.661506 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661318 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-sysctl-d\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.661506 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661382 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-var-lib-kubelet\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.661506 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661427 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-systemd\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.661506 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661455 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-lib-modules\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.661506 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661479 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-systemd\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.661506 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661493 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-sysctl-d\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.661923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661560 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-sys-fs\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz"
Apr 24 16:38:53.661923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661590 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-sys-fs\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz"
Apr 24 16:38:53.661923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661610 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-lib-modules\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.661923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661628 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57df94e3-357d-4303-9788-f5e7a9d03ae9-env-overrides\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.661923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661653 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-run-k8s-cni-cncf-io\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.661923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661673 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5461250a-930b-44a6-ba77-773586840e32-serviceca\") pod \"node-ca-wcmwg\" (UID: \"5461250a-930b-44a6-ba77-773586840e32\") " pod="openshift-image-registry/node-ca-wcmwg"
Apr 24 16:38:53.661923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661685 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/df14d60b-71e6-490b-a1cb-a94cefccd438-host-slash\") pod \"iptables-alerter-5bm47\" (UID: \"df14d60b-71e6-490b-a1cb-a94cefccd438\") " pod="openshift-network-operator/iptables-alerter-5bm47"
Apr 24 16:38:53.661923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661716 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0cf50135-6483-41fd-a681-60b2bfc1a66d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht"
Apr 24 16:38:53.661923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661749 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-var-lib-cni-multus\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.661923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661771 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-sys\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.661923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661790 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-run-k8s-cni-cncf-io\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.661923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661799 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbh9g\" (UniqueName: \"kubernetes.io/projected/0cf50135-6483-41fd-a681-60b2bfc1a66d-kube-api-access-mbh9g\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht"
Apr 24 16:38:53.661923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661827 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-run-ovn\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.661923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661847 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-var-lib-cni-multus\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.661923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.661910 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-run-ovn\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.662657 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662264 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c71dad60-6f43-425f-9755-72c272d5116b-tmp\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.662657 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662302 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhm7v\" (UniqueName: \"kubernetes.io/projected/df14d60b-71e6-490b-a1cb-a94cefccd438-kube-api-access-dhm7v\") pod \"iptables-alerter-5bm47\" (UID: \"df14d60b-71e6-490b-a1cb-a94cefccd438\") " pod="openshift-network-operator/iptables-alerter-5bm47"
Apr 24 16:38:53.662657 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662345 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-kubelet\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.662657 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662360 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-etc-kubernetes\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.662657 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662379 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-modprobe-d\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.662657 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662416 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6cp7\" (UniqueName: \"kubernetes.io/projected/c71dad60-6f43-425f-9755-72c272d5116b-kube-api-access-k6cp7\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.662657 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662446 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-kubelet\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.662657 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662448 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-etc-kubernetes\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.662657 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662518 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-sys\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.662657 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662531 2561 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 16:38:53.662657 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662595 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-modprobe-d\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.662657 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662642 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-socket-dir\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz"
Apr 24 16:38:53.663143 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662680 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-etc-selinux\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz"
Apr 24 16:38:53.663143 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662705 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-etc-openvswitch\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.663143 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662727 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-cni-bin\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.663143 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.662826 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-cni-bin\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.663889 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.663505 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-etc-openvswitch\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.663889 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.663540 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-socket-dir\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz"
Apr 24 16:38:53.663889 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.663601 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-etc-selinux\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz"
Apr 24 16:38:53.663889 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.663438 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/57df94e3-357d-4303-9788-f5e7a9d03ae9-env-overrides\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.663889 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.663782 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-os-release\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.663889 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.663824 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0cf50135-6483-41fd-a681-60b2bfc1a66d-cni-binary-copy\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht"
Apr 24 16:38:53.663889 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.663842 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0cf50135-6483-41fd-a681-60b2bfc1a66d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht"
Apr 24 16:38:53.664323 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.663910 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-os-release\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.664323 ip-10-0-128-44
kubenswrapper[2561]: I0424 16:38:53.663972 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-slash\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.664323 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664046 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-registration-dir\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" Apr 24 16:38:53.664323 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664148 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cf50135-6483-41fd-a681-60b2bfc1a66d-system-cni-dir\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.664323 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664221 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-run-ovn-kubernetes\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.664323 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664251 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8m94\" (UniqueName: \"kubernetes.io/projected/5461250a-930b-44a6-ba77-773586840e32-kube-api-access-l8m94\") pod \"node-ca-wcmwg\" (UID: \"5461250a-930b-44a6-ba77-773586840e32\") " 
pod="openshift-image-registry/node-ca-wcmwg" Apr 24 16:38:53.664323 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664295 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-kubernetes\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" Apr 24 16:38:53.664323 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664311 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0cf50135-6483-41fd-a681-60b2bfc1a66d-system-cni-dir\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.664688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664373 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-kubernetes\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" Apr 24 16:38:53.664688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664390 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-registration-dir\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" Apr 24 16:38:53.664688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664454 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0cf50135-6483-41fd-a681-60b2bfc1a66d-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.664688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664500 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs\") pod \"network-metrics-daemon-6c75x\" (UID: \"7abff685-57e5-4da6-b220-dbc6c56835ed\") " pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:38:53.664688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664509 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-run-ovn-kubernetes\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.664688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664549 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.664688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664620 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-system-cni-dir\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.664688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664637 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/0cf50135-6483-41fd-a681-60b2bfc1a66d-cni-binary-copy\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.664688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664646 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-slash\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.664688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664650 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-cni-binary-copy\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.665106 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664743 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-run-multus-certs\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.665106 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664758 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-system-cni-dir\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.665106 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664802 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.665106 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:53.664897 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:53.665106 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664923 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-run-multus-certs\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.665106 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.664978 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-var-lib-kubelet\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" Apr 24 16:38:53.665106 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665056 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-device-dir\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" Apr 24 16:38:53.665446 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665136 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0cf50135-6483-41fd-a681-60b2bfc1a66d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k7nht\" (UID: 
\"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.665446 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665228 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxx2z\" (UniqueName: \"kubernetes.io/projected/7abff685-57e5-4da6-b220-dbc6c56835ed-kube-api-access-vxx2z\") pod \"network-metrics-daemon-6c75x\" (UID: \"7abff685-57e5-4da6-b220-dbc6c56835ed\") " pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:38:53.665446 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665264 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0cf50135-6483-41fd-a681-60b2bfc1a66d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.665446 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665291 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-run-netns\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.665446 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665325 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-cni-netd\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.665446 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665385 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-multus-conf-dir\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.665446 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665411 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-device-dir\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" Apr 24 16:38:53.665446 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665419 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-host\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" Apr 24 16:38:53.665777 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665447 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-cni-binary-copy\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.665777 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665517 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-var-lib-kubelet\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" Apr 24 16:38:53.665777 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665451 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-run-systemd\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.665777 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665616 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-run-netns\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.665777 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665635 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-var-lib-openvswitch\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.665777 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665683 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5360254e-a924-4ddf-8043-9c43f8152fe3-agent-certs\") pod \"konnectivity-agent-9bw4h\" (UID: \"5360254e-a924-4ddf-8043-9c43f8152fe3\") " pod="kube-system/konnectivity-agent-9bw4h" Apr 24 16:38:53.665777 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665700 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0cf50135-6483-41fd-a681-60b2bfc1a66d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.665777 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665732 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5360254e-a924-4ddf-8043-9c43f8152fe3-konnectivity-ca\") pod \"konnectivity-agent-9bw4h\" (UID: \"5360254e-a924-4ddf-8043-9c43f8152fe3\") " pod="kube-system/konnectivity-agent-9bw4h" Apr 24 16:38:53.666151 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665823 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-host\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" Apr 24 16:38:53.666151 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665883 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-host-cni-netd\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.666151 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665891 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" Apr 24 16:38:53.666151 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665940 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-multus-conf-dir\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.666151 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665953 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/0cf50135-6483-41fd-a681-60b2bfc1a66d-os-release\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.666151 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.665980 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-log-socket\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.666151 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.666014 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-multus-cni-dir\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.666151 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.666098 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0cf50135-6483-41fd-a681-60b2bfc1a66d-os-release\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.666509 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.666171 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-log-socket\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.666509 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.666179 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-multus-cni-dir\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.666509 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.666260 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-run-systemd\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.666509 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.666323 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-var-lib-openvswitch\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.666509 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.666448 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gch6\" (UniqueName: \"kubernetes.io/projected/03b3f80f-e0be-4a62-92c7-eba566136dc7-kube-api-access-5gch6\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" Apr 24 16:38:53.666509 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.666488 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57df94e3-357d-4303-9788-f5e7a9d03ae9-ovn-node-metrics-cert\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.666521 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57df94e3-357d-4303-9788-f5e7a9d03ae9-ovnkube-script-lib\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.666496 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03b3f80f-e0be-4a62-92c7-eba566136dc7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.666557 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4g75d\" (UniqueName: \"kubernetes.io/projected/57df94e3-357d-4303-9788-f5e7a9d03ae9-kube-api-access-4g75d\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.666583 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-multus-socket-dir-parent\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.666628 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-run\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.666684 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/df14d60b-71e6-490b-a1cb-a94cefccd438-iptables-alerter-script\") pod \"iptables-alerter-5bm47\" (UID: \"df14d60b-71e6-490b-a1cb-a94cefccd438\") " pod="openshift-network-operator/iptables-alerter-5bm47" Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.666721 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0cf50135-6483-41fd-a681-60b2bfc1a66d-cnibin\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht" Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667089 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-systemd-units\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667178 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-hostroot\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667202 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-multus-socket-dir-parent\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh" Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667207 2561 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57df94e3-357d-4303-9788-f5e7a9d03ae9-ovnkube-config\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667245 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5461250a-930b-44a6-ba77-773586840e32-host\") pod \"node-ca-wcmwg\" (UID: \"5461250a-930b-44a6-ba77-773586840e32\") " pod="openshift-image-registry/node-ca-wcmwg"
Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667293 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-sysconfig\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667319 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-node-log\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667352 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-run-netns\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667411 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-sysctl-conf\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.668325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667445 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-multus-daemon-config\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667492 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75l9n\" (UniqueName: \"kubernetes.io/projected/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-kube-api-access-75l9n\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667518 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhj7k\" (UniqueName: \"kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k\") pod \"network-check-target-2p9kk\" (UID: \"f32abd88-b479-49d4-b013-a952386d695c\") " pod="openshift-network-diagnostics/network-check-target-2p9kk"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667550 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c71dad60-6f43-425f-9755-72c272d5116b-etc-tuned\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667609 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c71dad60-6f43-425f-9755-72c272d5116b-tmp\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667697 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/57df94e3-357d-4303-9788-f5e7a9d03ae9-ovnkube-config\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667790 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0cf50135-6483-41fd-a681-60b2bfc1a66d-cnibin\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667852 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5360254e-a924-4ddf-8043-9c43f8152fe3-konnectivity-ca\") pod \"konnectivity-agent-9bw4h\" (UID: \"5360254e-a924-4ddf-8043-9c43f8152fe3\") " pod="kube-system/konnectivity-agent-9bw4h"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667935 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-systemd-units\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.667996 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5461250a-930b-44a6-ba77-773586840e32-host\") pod \"node-ca-wcmwg\" (UID: \"5461250a-930b-44a6-ba77-773586840e32\") " pod="openshift-image-registry/node-ca-wcmwg"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.668052 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-run\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.668062 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-hostroot\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.668110 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/57df94e3-357d-4303-9788-f5e7a9d03ae9-node-log\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:53.668157 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs podName:7abff685-57e5-4da6-b220-dbc6c56835ed nodeName:}" failed. No retries permitted until 2026-04-24 16:38:54.168105636 +0000 UTC m=+3.020976348 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs") pod "network-metrics-daemon-6c75x" (UID: "7abff685-57e5-4da6-b220-dbc6c56835ed") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.668236 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-sysconfig\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.668707 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/57df94e3-357d-4303-9788-f5e7a9d03ae9-ovnkube-script-lib\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.669086 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-host-run-netns\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.669383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.669367 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c71dad60-6f43-425f-9755-72c272d5116b-etc-sysctl-conf\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.670628 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.670472 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-multus-daemon-config\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.671069 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.671026 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c71dad60-6f43-425f-9755-72c272d5116b-etc-tuned\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.672945 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.672794 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/57df94e3-357d-4303-9788-f5e7a9d03ae9-ovn-node-metrics-cert\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.673385 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.673333 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5360254e-a924-4ddf-8043-9c43f8152fe3-agent-certs\") pod \"konnectivity-agent-9bw4h\" (UID: \"5360254e-a924-4ddf-8043-9c43f8152fe3\") " pod="kube-system/konnectivity-agent-9bw4h"
Apr 24 16:38:53.673477 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.673429 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbh9g\" (UniqueName: \"kubernetes.io/projected/0cf50135-6483-41fd-a681-60b2bfc1a66d-kube-api-access-mbh9g\") pod \"multus-additional-cni-plugins-k7nht\" (UID: \"0cf50135-6483-41fd-a681-60b2bfc1a66d\") " pod="openshift-multus/multus-additional-cni-plugins-k7nht"
Apr 24 16:38:53.673967 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.673920 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6cp7\" (UniqueName: \"kubernetes.io/projected/c71dad60-6f43-425f-9755-72c272d5116b-kube-api-access-k6cp7\") pod \"tuned-hkhzt\" (UID: \"c71dad60-6f43-425f-9755-72c272d5116b\") " pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.674226 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.674207 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxx2z\" (UniqueName: \"kubernetes.io/projected/7abff685-57e5-4da6-b220-dbc6c56835ed-kube-api-access-vxx2z\") pod \"network-metrics-daemon-6c75x\" (UID: \"7abff685-57e5-4da6-b220-dbc6c56835ed\") " pod="openshift-multus/network-metrics-daemon-6c75x"
Apr 24 16:38:53.675370 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.675349 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8m94\" (UniqueName: \"kubernetes.io/projected/5461250a-930b-44a6-ba77-773586840e32-kube-api-access-l8m94\") pod \"node-ca-wcmwg\" (UID: \"5461250a-930b-44a6-ba77-773586840e32\") " pod="openshift-image-registry/node-ca-wcmwg"
Apr 24 16:38:53.678013 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:53.677996 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:38:53.678013 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:53.678017 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:38:53.678164 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:53.678026 2561 projected.go:194] Error preparing data for projected volume kube-api-access-zhj7k for pod openshift-network-diagnostics/network-check-target-2p9kk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:38:53.678164 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:53.678106 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k podName:f32abd88-b479-49d4-b013-a952386d695c nodeName:}" failed. No retries permitted until 2026-04-24 16:38:54.178088768 +0000 UTC m=+3.030959463 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zhj7k" (UniqueName: "kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k") pod "network-check-target-2p9kk" (UID: "f32abd88-b479-49d4-b013-a952386d695c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:38:53.679771 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.679752 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75l9n\" (UniqueName: \"kubernetes.io/projected/3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2-kube-api-access-75l9n\") pod \"multus-5qxgh\" (UID: \"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2\") " pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.680439 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.680422 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gch6\" (UniqueName: \"kubernetes.io/projected/03b3f80f-e0be-4a62-92c7-eba566136dc7-kube-api-access-5gch6\") pod \"aws-ebs-csi-driver-node-vpkmz\" (UID: \"03b3f80f-e0be-4a62-92c7-eba566136dc7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz"
Apr 24 16:38:53.684601 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.684580 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g75d\" (UniqueName: \"kubernetes.io/projected/57df94e3-357d-4303-9788-f5e7a9d03ae9-kube-api-access-4g75d\") pod \"ovnkube-node-rrwzc\" (UID: \"57df94e3-357d-4303-9788-f5e7a9d03ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.767941 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.767906 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/df14d60b-71e6-490b-a1cb-a94cefccd438-host-slash\") pod \"iptables-alerter-5bm47\" (UID: \"df14d60b-71e6-490b-a1cb-a94cefccd438\") " pod="openshift-network-operator/iptables-alerter-5bm47"
Apr 24 16:38:53.767941 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.767947 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhm7v\" (UniqueName: \"kubernetes.io/projected/df14d60b-71e6-490b-a1cb-a94cefccd438-kube-api-access-dhm7v\") pod \"iptables-alerter-5bm47\" (UID: \"df14d60b-71e6-490b-a1cb-a94cefccd438\") " pod="openshift-network-operator/iptables-alerter-5bm47"
Apr 24 16:38:53.768199 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.768029 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/df14d60b-71e6-490b-a1cb-a94cefccd438-host-slash\") pod \"iptables-alerter-5bm47\" (UID: \"df14d60b-71e6-490b-a1cb-a94cefccd438\") " pod="openshift-network-operator/iptables-alerter-5bm47"
Apr 24 16:38:53.768199 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.768127 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/df14d60b-71e6-490b-a1cb-a94cefccd438-iptables-alerter-script\") pod \"iptables-alerter-5bm47\" (UID: \"df14d60b-71e6-490b-a1cb-a94cefccd438\") " pod="openshift-network-operator/iptables-alerter-5bm47"
Apr 24 16:38:53.768730 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.768703 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/df14d60b-71e6-490b-a1cb-a94cefccd438-iptables-alerter-script\") pod \"iptables-alerter-5bm47\" (UID: \"df14d60b-71e6-490b-a1cb-a94cefccd438\") " pod="openshift-network-operator/iptables-alerter-5bm47"
Apr 24 16:38:53.776301 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.776277 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhm7v\" (UniqueName: \"kubernetes.io/projected/df14d60b-71e6-490b-a1cb-a94cefccd438-kube-api-access-dhm7v\") pod \"iptables-alerter-5bm47\" (UID: \"df14d60b-71e6-490b-a1cb-a94cefccd438\") " pod="openshift-network-operator/iptables-alerter-5bm47"
Apr 24 16:38:53.852474 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.852386 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9bw4h"
Apr 24 16:38:53.859196 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.859166 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz"
Apr 24 16:38:53.867872 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.867848 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k7nht"
Apr 24 16:38:53.872493 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.872473 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc"
Apr 24 16:38:53.879097 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.879071 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hkhzt"
Apr 24 16:38:53.884740 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.884718 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wcmwg"
Apr 24 16:38:53.891292 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.891269 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5qxgh"
Apr 24 16:38:53.897931 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:53.897906 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5bm47"
Apr 24 16:38:54.169968 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:54.169868 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs\") pod \"network-metrics-daemon-6c75x\" (UID: \"7abff685-57e5-4da6-b220-dbc6c56835ed\") " pod="openshift-multus/network-metrics-daemon-6c75x"
Apr 24 16:38:54.170145 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:54.170048 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:38:54.170145 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:54.170135 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs podName:7abff685-57e5-4da6-b220-dbc6c56835ed nodeName:}" failed. No retries permitted until 2026-04-24 16:38:55.170103154 +0000 UTC m=+4.022973857 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs") pod "network-metrics-daemon-6c75x" (UID: "7abff685-57e5-4da6-b220-dbc6c56835ed") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:38:54.264075 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:54.264009 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5461250a_930b_44a6_ba77_773586840e32.slice/crio-1b6663db0fb7abdec47cd10924531c678de0ef9c9109435c95a819865ca67bbd WatchSource:0}: Error finding container 1b6663db0fb7abdec47cd10924531c678de0ef9c9109435c95a819865ca67bbd: Status 404 returned error can't find the container with id 1b6663db0fb7abdec47cd10924531c678de0ef9c9109435c95a819865ca67bbd
Apr 24 16:38:54.265551 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:54.265397 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf14d60b_71e6_490b_a1cb_a94cefccd438.slice/crio-812d993f09b1ff3b09ca7807e3a3036ae15c2be512685ebb7ad66854fe4a333f WatchSource:0}: Error finding container 812d993f09b1ff3b09ca7807e3a3036ae15c2be512685ebb7ad66854fe4a333f: Status 404 returned error can't find the container with id 812d993f09b1ff3b09ca7807e3a3036ae15c2be512685ebb7ad66854fe4a333f
Apr 24 16:38:54.267980 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:54.267959 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c59ab7c_c1d6_4ca5_b111_ef7591ea2ea2.slice/crio-76b57cfc9ba2b28e6a0c938bcac1bd6ff32773ee2715d13d14cf04293323ad06 WatchSource:0}: Error finding container 76b57cfc9ba2b28e6a0c938bcac1bd6ff32773ee2715d13d14cf04293323ad06: Status 404 returned error can't find the container with id 76b57cfc9ba2b28e6a0c938bcac1bd6ff32773ee2715d13d14cf04293323ad06
Apr 24 16:38:54.269237 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:54.269217 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc71dad60_6f43_425f_9755_72c272d5116b.slice/crio-199985d3f209d39a5ad5be21c6a1c43aa3b16053861bde32e17aee03ea754e60 WatchSource:0}: Error finding container 199985d3f209d39a5ad5be21c6a1c43aa3b16053861bde32e17aee03ea754e60: Status 404 returned error can't find the container with id 199985d3f209d39a5ad5be21c6a1c43aa3b16053861bde32e17aee03ea754e60
Apr 24 16:38:54.270466 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:54.270173 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cf50135_6483_41fd_a681_60b2bfc1a66d.slice/crio-bd4b854c52127e4021c3bdc6504b0ef550e365770bc6afaf5dce90eacab3e7ac WatchSource:0}: Error finding container bd4b854c52127e4021c3bdc6504b0ef550e365770bc6afaf5dce90eacab3e7ac: Status 404 returned error can't find the container with id bd4b854c52127e4021c3bdc6504b0ef550e365770bc6afaf5dce90eacab3e7ac
Apr 24 16:38:54.270466 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:54.270207 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhj7k\" (UniqueName: \"kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k\") pod \"network-check-target-2p9kk\" (UID: \"f32abd88-b479-49d4-b013-a952386d695c\") " pod="openshift-network-diagnostics/network-check-target-2p9kk"
Apr 24 16:38:54.270466 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:54.270367 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:38:54.270466 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:54.270383 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:38:54.270466 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:54.270396 2561 projected.go:194] Error preparing data for projected volume kube-api-access-zhj7k for pod openshift-network-diagnostics/network-check-target-2p9kk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:38:54.270466 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:54.270444 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k podName:f32abd88-b479-49d4-b013-a952386d695c nodeName:}" failed. No retries permitted until 2026-04-24 16:38:55.270427088 +0000 UTC m=+4.123297785 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zhj7k" (UniqueName: "kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k") pod "network-check-target-2p9kk" (UID: "f32abd88-b479-49d4-b013-a952386d695c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:38:54.271680 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:54.271361 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5360254e_a924_4ddf_8043_9c43f8152fe3.slice/crio-fbbff860aef9e0fde68b07c103a8a0141c5011ec8fe602a607ac219e21d2f200 WatchSource:0}: Error finding container fbbff860aef9e0fde68b07c103a8a0141c5011ec8fe602a607ac219e21d2f200: Status 404 returned error can't find the container with id fbbff860aef9e0fde68b07c103a8a0141c5011ec8fe602a607ac219e21d2f200
Apr 24 16:38:54.271969 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:54.271823 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03b3f80f_e0be_4a62_92c7_eba566136dc7.slice/crio-140c2955f8b6eb8adcb4567739a255e5fa4c20c9fda645ffcf7b047a3645abbf WatchSource:0}: Error finding container 140c2955f8b6eb8adcb4567739a255e5fa4c20c9fda645ffcf7b047a3645abbf: Status 404 returned error can't find the container with id 140c2955f8b6eb8adcb4567739a255e5fa4c20c9fda645ffcf7b047a3645abbf
Apr 24 16:38:54.273850 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:38:54.273105 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57df94e3_357d_4303_9788_f5e7a9d03ae9.slice/crio-4cd3d878cd960513a75b14b2046ec537923d70df1d4704eeefd8e64ff7d03d00 WatchSource:0}: Error finding container 4cd3d878cd960513a75b14b2046ec537923d70df1d4704eeefd8e64ff7d03d00: Status 404 returned error can't find the container with id 4cd3d878cd960513a75b14b2046ec537923d70df1d4704eeefd8e64ff7d03d00
Apr 24 16:38:54.604326 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:54.604204 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:33:52 +0000 UTC" deadline="2027-11-24 21:37:41.982454694 +0000 UTC"
Apr 24 16:38:54.604326 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:54.604247 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13900h58m47.378211793s"
Apr 24 16:38:54.647423 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:54.647098 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk"
Apr 24 16:38:54.647423 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:54.647244 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c"
Apr 24 16:38:54.656060 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:54.656015 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" event={"ID":"03b3f80f-e0be-4a62-92c7-eba566136dc7","Type":"ContainerStarted","Data":"140c2955f8b6eb8adcb4567739a255e5fa4c20c9fda645ffcf7b047a3645abbf"}
Apr 24 16:38:54.658177 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:54.658143 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k7nht" event={"ID":"0cf50135-6483-41fd-a681-60b2bfc1a66d","Type":"ContainerStarted","Data":"bd4b854c52127e4021c3bdc6504b0ef550e365770bc6afaf5dce90eacab3e7ac"}
Apr 24 16:38:54.660348 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:54.660319 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5qxgh" event={"ID":"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2","Type":"ContainerStarted","Data":"76b57cfc9ba2b28e6a0c938bcac1bd6ff32773ee2715d13d14cf04293323ad06"}
Apr 24 16:38:54.662930 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:54.662893 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wcmwg" event={"ID":"5461250a-930b-44a6-ba77-773586840e32","Type":"ContainerStarted","Data":"1b6663db0fb7abdec47cd10924531c678de0ef9c9109435c95a819865ca67bbd"}
Apr 24 16:38:54.666277 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:54.666246 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-44.ec2.internal" event={"ID":"7c14f0193c74d30d8fc8bec912702c00","Type":"ContainerStarted","Data":"012241f22e72378d2ced376a565b2b47955cb10da43d47a974fb284a31232c08"}
Apr 24 16:38:54.670351 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:54.670287 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" event={"ID":"57df94e3-357d-4303-9788-f5e7a9d03ae9","Type":"ContainerStarted","Data":"4cd3d878cd960513a75b14b2046ec537923d70df1d4704eeefd8e64ff7d03d00"}
Apr 24 16:38:54.672149 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:54.672098 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9bw4h" event={"ID":"5360254e-a924-4ddf-8043-9c43f8152fe3","Type":"ContainerStarted","Data":"fbbff860aef9e0fde68b07c103a8a0141c5011ec8fe602a607ac219e21d2f200"}
Apr 24 16:38:54.678733 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:54.678673 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" event={"ID":"c71dad60-6f43-425f-9755-72c272d5116b","Type":"ContainerStarted","Data":"199985d3f209d39a5ad5be21c6a1c43aa3b16053861bde32e17aee03ea754e60"}
Apr 24 16:38:54.683217 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:54.683188 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5bm47" event={"ID":"df14d60b-71e6-490b-a1cb-a94cefccd438","Type":"ContainerStarted","Data":"812d993f09b1ff3b09ca7807e3a3036ae15c2be512685ebb7ad66854fe4a333f"}
Apr 24 16:38:55.177004 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:55.176902 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs\") pod \"network-metrics-daemon-6c75x\" (UID: \"7abff685-57e5-4da6-b220-dbc6c56835ed\") " pod="openshift-multus/network-metrics-daemon-6c75x"
Apr 24 16:38:55.177192 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:55.177068 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:38:55.177192 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:55.177164 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs podName:7abff685-57e5-4da6-b220-dbc6c56835ed nodeName:}" failed. No retries permitted until 2026-04-24 16:38:57.177143938 +0000 UTC m=+6.030014645 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs") pod "network-metrics-daemon-6c75x" (UID: "7abff685-57e5-4da6-b220-dbc6c56835ed") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:38:55.278581 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:55.277976 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhj7k\" (UniqueName: \"kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k\") pod \"network-check-target-2p9kk\" (UID: \"f32abd88-b479-49d4-b013-a952386d695c\") " pod="openshift-network-diagnostics/network-check-target-2p9kk"
Apr 24 16:38:55.278581 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:55.278138 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:38:55.278581 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:55.278157 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:38:55.278581 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:55.278170 2561 projected.go:194] Error preparing data for projected volume kube-api-access-zhj7k for pod openshift-network-diagnostics/network-check-target-2p9kk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:38:55.278581 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:55.278234 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k podName:f32abd88-b479-49d4-b013-a952386d695c nodeName:}" failed. No retries permitted until 2026-04-24 16:38:57.278216158 +0000 UTC m=+6.131086857 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zhj7k" (UniqueName: "kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k") pod "network-check-target-2p9kk" (UID: "f32abd88-b479-49d4-b013-a952386d695c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:38:55.653359 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:55.653247 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x"
Apr 24 16:38:55.653783 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:55.653493 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed"
Apr 24 16:38:55.716585 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:55.716546 2561 generic.go:358] "Generic (PLEG): container finished" podID="883e2fac0a836c5267813802443d2ab5" containerID="f50651a98b914ea178fe4aad1fa7484e1ee4b855a276ede90d7c29a71b7b9e89" exitCode=0
Apr 24 16:38:55.717568 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:55.717541 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal" event={"ID":"883e2fac0a836c5267813802443d2ab5","Type":"ContainerDied","Data":"f50651a98b914ea178fe4aad1fa7484e1ee4b855a276ede90d7c29a71b7b9e89"}
Apr 24 16:38:55.731739 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:55.731549 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-44.ec2.internal" podStartSLOduration=3.731531349 podStartE2EDuration="3.731531349s" podCreationTimestamp="2026-04-24 16:38:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:38:54.680906778 +0000 UTC m=+3.533777495" watchObservedRunningTime="2026-04-24 16:38:55.731531349 +0000 UTC m=+4.584402067"
Apr 24 16:38:56.647066 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:56.646981 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk"
Apr 24 16:38:56.647227 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:56.647134 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c"
Apr 24 16:38:56.726221 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:56.726184 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal" event={"ID":"883e2fac0a836c5267813802443d2ab5","Type":"ContainerStarted","Data":"48ca5e421da0bedf14f541c1c0c4a1013a1cae822c9a746b30bb2bfcf2ac70b6"}
Apr 24 16:38:56.742933 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:56.742374 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-44.ec2.internal" podStartSLOduration=4.7423561450000005 podStartE2EDuration="4.742356145s" podCreationTimestamp="2026-04-24 16:38:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:38:56.742103347 +0000 UTC m=+5.594974061" watchObservedRunningTime="2026-04-24 16:38:56.742356145 +0000 UTC m=+5.595226863"
Apr 24 16:38:57.193307 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:57.192706 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs\") pod \"network-metrics-daemon-6c75x\" (UID: \"7abff685-57e5-4da6-b220-dbc6c56835ed\") " pod="openshift-multus/network-metrics-daemon-6c75x"
Apr 24 16:38:57.193307 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:57.192890 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:38:57.193307 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:57.192951 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs
podName:7abff685-57e5-4da6-b220-dbc6c56835ed nodeName:}" failed. No retries permitted until 2026-04-24 16:39:01.19293426 +0000 UTC m=+10.045804957 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs") pod "network-metrics-daemon-6c75x" (UID: "7abff685-57e5-4da6-b220-dbc6c56835ed") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:57.293811 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:57.293770 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhj7k\" (UniqueName: \"kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k\") pod \"network-check-target-2p9kk\" (UID: \"f32abd88-b479-49d4-b013-a952386d695c\") " pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:38:57.293996 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:57.293946 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:38:57.293996 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:57.293965 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:38:57.293996 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:57.293977 2561 projected.go:194] Error preparing data for projected volume kube-api-access-zhj7k for pod openshift-network-diagnostics/network-check-target-2p9kk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:57.294187 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:57.294040 2561 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k podName:f32abd88-b479-49d4-b013-a952386d695c nodeName:}" failed. No retries permitted until 2026-04-24 16:39:01.294022004 +0000 UTC m=+10.146892701 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zhj7k" (UniqueName: "kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k") pod "network-check-target-2p9kk" (UID: "f32abd88-b479-49d4-b013-a952386d695c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:57.647164 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:57.646957 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:38:57.647164 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:57.647099 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:38:58.647269 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:58.646768 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:38:58.647269 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:58.646895 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c" Apr 24 16:38:59.646547 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:38:59.646512 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:38:59.646748 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:38:59.646650 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:39:00.646583 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:00.646528 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:39:00.647050 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:00.646669 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c" Apr 24 16:39:01.226263 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:01.226226 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs\") pod \"network-metrics-daemon-6c75x\" (UID: \"7abff685-57e5-4da6-b220-dbc6c56835ed\") " pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:39:01.226391 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:01.226355 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:01.226452 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:01.226417 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs podName:7abff685-57e5-4da6-b220-dbc6c56835ed nodeName:}" failed. No retries permitted until 2026-04-24 16:39:09.226399327 +0000 UTC m=+18.079270019 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs") pod "network-metrics-daemon-6c75x" (UID: "7abff685-57e5-4da6-b220-dbc6c56835ed") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:01.327944 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:01.327319 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhj7k\" (UniqueName: \"kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k\") pod \"network-check-target-2p9kk\" (UID: \"f32abd88-b479-49d4-b013-a952386d695c\") " pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:39:01.327944 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:01.327486 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:01.327944 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:01.327507 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:01.327944 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:01.327520 2561 projected.go:194] Error preparing data for projected volume kube-api-access-zhj7k for pod openshift-network-diagnostics/network-check-target-2p9kk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:01.327944 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:01.327578 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k podName:f32abd88-b479-49d4-b013-a952386d695c nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:09.327559019 +0000 UTC m=+18.180429715 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zhj7k" (UniqueName: "kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k") pod "network-check-target-2p9kk" (UID: "f32abd88-b479-49d4-b013-a952386d695c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:01.647722 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:01.647222 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:39:01.647722 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:01.647347 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:39:02.646403 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:02.646367 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:39:02.646583 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:02.646513 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c" Apr 24 16:39:03.646802 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:03.646766 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:39:03.647251 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:03.646914 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:39:04.646619 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:04.646583 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:39:04.646804 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:04.646711 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c" Apr 24 16:39:05.646464 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:05.646428 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:39:05.646865 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:05.646553 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:39:06.646375 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:06.646302 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:39:06.646499 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:06.646414 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c" Apr 24 16:39:07.646822 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:07.646755 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:39:07.647244 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:07.646935 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:39:08.646871 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:08.646832 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:39:08.647327 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:08.646968 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c" Apr 24 16:39:09.280753 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:09.280714 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs\") pod \"network-metrics-daemon-6c75x\" (UID: \"7abff685-57e5-4da6-b220-dbc6c56835ed\") " pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:39:09.280902 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:09.280837 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:09.280956 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:09.280919 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs podName:7abff685-57e5-4da6-b220-dbc6c56835ed nodeName:}" failed. No retries permitted until 2026-04-24 16:39:25.280877311 +0000 UTC m=+34.133748003 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs") pod "network-metrics-daemon-6c75x" (UID: "7abff685-57e5-4da6-b220-dbc6c56835ed") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:09.381599 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:09.381264 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhj7k\" (UniqueName: \"kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k\") pod \"network-check-target-2p9kk\" (UID: \"f32abd88-b479-49d4-b013-a952386d695c\") " pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:39:09.381599 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:09.381454 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:09.381599 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:09.381473 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:09.381599 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:09.381485 2561 projected.go:194] Error preparing data for projected volume kube-api-access-zhj7k for pod openshift-network-diagnostics/network-check-target-2p9kk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:09.381599 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:09.381541 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k podName:f32abd88-b479-49d4-b013-a952386d695c nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:25.381521651 +0000 UTC m=+34.234392357 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zhj7k" (UniqueName: "kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k") pod "network-check-target-2p9kk" (UID: "f32abd88-b479-49d4-b013-a952386d695c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:09.650284 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:09.650206 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:39:09.650703 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:09.650331 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:39:10.646403 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:10.646373 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:39:10.646555 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:10.646479 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c" Apr 24 16:39:11.646938 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.646776 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:39:11.647457 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:11.647018 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:39:11.753187 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.753149 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" event={"ID":"57df94e3-357d-4303-9788-f5e7a9d03ae9","Type":"ContainerStarted","Data":"c0bec8ffd76bbd8c302da272700faf47862c57833224e84c87d0d1e68d050369"} Apr 24 16:39:11.753269 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.753189 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" event={"ID":"57df94e3-357d-4303-9788-f5e7a9d03ae9","Type":"ContainerStarted","Data":"94358bcbcd390ebb4323743c2f4065aede96bfc2c3bada86404cffc52c9d896f"} Apr 24 16:39:11.753269 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.753203 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" event={"ID":"57df94e3-357d-4303-9788-f5e7a9d03ae9","Type":"ContainerStarted","Data":"8346ca4f212aac5cf9d9d82c1d0ad80df7cb49a642e6c8cecdc4c6649ea01054"} Apr 24 16:39:11.753269 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.753212 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" 
event={"ID":"57df94e3-357d-4303-9788-f5e7a9d03ae9","Type":"ContainerStarted","Data":"0ab2a9978193ce249e38ebf7ae7b82634f05847a2c898b7453e7fcedcb28dfce"} Apr 24 16:39:11.753269 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.753221 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" event={"ID":"57df94e3-357d-4303-9788-f5e7a9d03ae9","Type":"ContainerStarted","Data":"489f4caf11320475dd030d4ded172ebe366b2e3406410649e82887c7adc5e4ed"} Apr 24 16:39:11.754388 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.754362 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9bw4h" event={"ID":"5360254e-a924-4ddf-8043-9c43f8152fe3","Type":"ContainerStarted","Data":"b2b54fb7944bfefe13e118fb5d459d3d99fb282bbd95e5549e831f198d506355"} Apr 24 16:39:11.755506 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.755485 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" event={"ID":"c71dad60-6f43-425f-9755-72c272d5116b","Type":"ContainerStarted","Data":"fe1948e1efa8644eb0ff8d55b2a8ddc41c6199d9d8514f18498b433cd2fac9d2"} Apr 24 16:39:11.756617 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.756594 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" event={"ID":"03b3f80f-e0be-4a62-92c7-eba566136dc7","Type":"ContainerStarted","Data":"e4c20430e357472119681cb9eba8cdae5d7dc33cddbb11daed7dd56933abe09a"} Apr 24 16:39:11.757993 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.757967 2561 generic.go:358] "Generic (PLEG): container finished" podID="0cf50135-6483-41fd-a681-60b2bfc1a66d" containerID="644a72e88cd0950980ed39972a09fa1fc927f7718406d659729111ae98e8b816" exitCode=0 Apr 24 16:39:11.758077 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.758055 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k7nht" 
event={"ID":"0cf50135-6483-41fd-a681-60b2bfc1a66d","Type":"ContainerDied","Data":"644a72e88cd0950980ed39972a09fa1fc927f7718406d659729111ae98e8b816"} Apr 24 16:39:11.759573 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.759547 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5qxgh" event={"ID":"3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2","Type":"ContainerStarted","Data":"6f59f26cfbf2c66dd5e8893657c285ab586901617a9a3bb56d575d11ce6a31fb"} Apr 24 16:39:11.760683 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.760666 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wcmwg" event={"ID":"5461250a-930b-44a6-ba77-773586840e32","Type":"ContainerStarted","Data":"e2f73a6b694f354eb61d4319be5af8a4f9c82fc5e32f2dd3ff551c2540d1c117"} Apr 24 16:39:11.770386 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.770292 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9bw4h" podStartSLOduration=3.804261885 podStartE2EDuration="20.770270862s" podCreationTimestamp="2026-04-24 16:38:51 +0000 UTC" firstStartedPulling="2026-04-24 16:38:54.27336883 +0000 UTC m=+3.126239536" lastFinishedPulling="2026-04-24 16:39:11.239377814 +0000 UTC m=+20.092248513" observedRunningTime="2026-04-24 16:39:11.767549637 +0000 UTC m=+20.620420346" watchObservedRunningTime="2026-04-24 16:39:11.770270862 +0000 UTC m=+20.623141575" Apr 24 16:39:11.781788 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.781753 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hkhzt" podStartSLOduration=3.812034536 podStartE2EDuration="20.781741344s" podCreationTimestamp="2026-04-24 16:38:51 +0000 UTC" firstStartedPulling="2026-04-24 16:38:54.271240435 +0000 UTC m=+3.124111129" lastFinishedPulling="2026-04-24 16:39:11.240947237 +0000 UTC m=+20.093817937" observedRunningTime="2026-04-24 16:39:11.781548227 +0000 UTC 
m=+20.634418941" watchObservedRunningTime="2026-04-24 16:39:11.781741344 +0000 UTC m=+20.634612058" Apr 24 16:39:11.794154 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.794094 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wcmwg" podStartSLOduration=8.70337885 podStartE2EDuration="20.794084081s" podCreationTimestamp="2026-04-24 16:38:51 +0000 UTC" firstStartedPulling="2026-04-24 16:38:54.265892361 +0000 UTC m=+3.118763055" lastFinishedPulling="2026-04-24 16:39:06.356597589 +0000 UTC m=+15.209468286" observedRunningTime="2026-04-24 16:39:11.793811173 +0000 UTC m=+20.646681887" watchObservedRunningTime="2026-04-24 16:39:11.794084081 +0000 UTC m=+20.646954794" Apr 24 16:39:11.831085 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:11.831036 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5qxgh" podStartSLOduration=3.802579727 podStartE2EDuration="20.831021784s" podCreationTimestamp="2026-04-24 16:38:51 +0000 UTC" firstStartedPulling="2026-04-24 16:38:54.269764104 +0000 UTC m=+3.122634799" lastFinishedPulling="2026-04-24 16:39:11.298206153 +0000 UTC m=+20.151076856" observedRunningTime="2026-04-24 16:39:11.83085347 +0000 UTC m=+20.683724185" watchObservedRunningTime="2026-04-24 16:39:11.831021784 +0000 UTC m=+20.683892522" Apr 24 16:39:12.453232 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:12.453007 2561 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 16:39:12.636618 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:12.636505 2561 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T16:39:12.453229487Z","UUID":"585039ca-bfb0-40dc-9f58-8254333439d9","Handler":null,"Name":"","Endpoint":""} Apr 24 16:39:12.639228 
ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:12.639201 2561 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 16:39:12.639228 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:12.639232 2561 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 16:39:12.646220 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:12.646195 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:39:12.646332 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:12.646285 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c" Apr 24 16:39:12.765820 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:12.765787 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" event={"ID":"57df94e3-357d-4303-9788-f5e7a9d03ae9","Type":"ContainerStarted","Data":"4aae22b497254e5cbb491a8b4e31d2bbf6561685f07c8459236c7850a4857538"} Apr 24 16:39:12.767165 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:12.767110 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5bm47" event={"ID":"df14d60b-71e6-490b-a1cb-a94cefccd438","Type":"ContainerStarted","Data":"08f5a5e67d1fac5ca27b5537a13801062af44680ca5be2ac1d8e0d4fa1c85769"} Apr 24 16:39:12.768807 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:12.768784 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" event={"ID":"03b3f80f-e0be-4a62-92c7-eba566136dc7","Type":"ContainerStarted","Data":"a0b823389545b9e91cdda00f05909f8bf7d375d42c2f102752cdccf7f6bb36fd"} Apr 24 16:39:13.649989 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:13.649908 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:39:13.650181 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:13.650036 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:39:13.772470 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:13.772415 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" event={"ID":"03b3f80f-e0be-4a62-92c7-eba566136dc7","Type":"ContainerStarted","Data":"66187eeca4f2ecc08980a3da5a9904c8687807c375551f792467bec522ba7f74"} Apr 24 16:39:13.788358 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:13.788312 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5bm47" podStartSLOduration=6.168112392 podStartE2EDuration="22.788297021s" podCreationTimestamp="2026-04-24 16:38:51 +0000 UTC" firstStartedPulling="2026-04-24 16:38:54.267379671 +0000 UTC m=+3.120250366" lastFinishedPulling="2026-04-24 16:39:10.887564301 +0000 UTC m=+19.740434995" observedRunningTime="2026-04-24 16:39:12.780959925 +0000 UTC m=+21.633830641" watchObservedRunningTime="2026-04-24 16:39:13.788297021 +0000 UTC m=+22.641167771" Apr 24 16:39:13.788528 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:13.788429 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vpkmz" podStartSLOduration=3.698171609 podStartE2EDuration="22.788425123s" podCreationTimestamp="2026-04-24 16:38:51 +0000 UTC" firstStartedPulling="2026-04-24 16:38:54.273877346 +0000 UTC m=+3.126748037" lastFinishedPulling="2026-04-24 16:39:13.364130852 +0000 UTC m=+22.217001551" observedRunningTime="2026-04-24 16:39:13.788101128 +0000 UTC m=+22.640971845" watchObservedRunningTime="2026-04-24 16:39:13.788425123 +0000 UTC m=+22.641295837" Apr 24 16:39:14.646639 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:14.646605 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:39:14.646826 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:14.646726 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c" Apr 24 16:39:14.778797 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:14.778765 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" event={"ID":"57df94e3-357d-4303-9788-f5e7a9d03ae9","Type":"ContainerStarted","Data":"53c0b3a13f8f5ab754ecc88e1527a05a354e93ad5f360933a27c81173441e9eb"} Apr 24 16:39:15.098665 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:15.098632 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9bw4h" Apr 24 16:39:15.099377 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:15.099342 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9bw4h" Apr 24 16:39:15.650276 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:15.650191 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:39:15.650432 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:15.650326 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:39:15.782433 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:15.782405 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k7nht" event={"ID":"0cf50135-6483-41fd-a681-60b2bfc1a66d","Type":"ContainerStarted","Data":"12cb490c07789eb0441f9b165a82c3986e24abbfd6f576a9864ef475e54ccb74"} Apr 24 16:39:15.782951 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:15.782594 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9bw4h" Apr 24 16:39:15.783099 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:15.783082 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9bw4h" Apr 24 16:39:16.646608 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:16.646406 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:39:16.646720 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:16.646643 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c" Apr 24 16:39:16.787293 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:16.787258 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" event={"ID":"57df94e3-357d-4303-9788-f5e7a9d03ae9","Type":"ContainerStarted","Data":"500b4af40c74beb547079aa25064ae785658023b6e8cb8d5ba3f0216884ff9a8"} Apr 24 16:39:16.787715 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:16.787667 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:39:16.787715 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:16.787697 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:39:16.787715 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:16.787709 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:39:16.788885 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:16.788857 2561 generic.go:358] "Generic (PLEG): container finished" podID="0cf50135-6483-41fd-a681-60b2bfc1a66d" containerID="12cb490c07789eb0441f9b165a82c3986e24abbfd6f576a9864ef475e54ccb74" exitCode=0 Apr 24 16:39:16.788995 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:16.788933 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k7nht" event={"ID":"0cf50135-6483-41fd-a681-60b2bfc1a66d","Type":"ContainerDied","Data":"12cb490c07789eb0441f9b165a82c3986e24abbfd6f576a9864ef475e54ccb74"} Apr 24 16:39:16.802852 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:16.802835 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:39:16.802936 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:16.802891 2561 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:39:16.818609 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:16.818575 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" podStartSLOduration=8.806261382 podStartE2EDuration="25.818562281s" podCreationTimestamp="2026-04-24 16:38:51 +0000 UTC" firstStartedPulling="2026-04-24 16:38:54.27479635 +0000 UTC m=+3.127667048" lastFinishedPulling="2026-04-24 16:39:11.287097239 +0000 UTC m=+20.139967947" observedRunningTime="2026-04-24 16:39:16.818151019 +0000 UTC m=+25.671021734" watchObservedRunningTime="2026-04-24 16:39:16.818562281 +0000 UTC m=+25.671432995" Apr 24 16:39:17.646694 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:17.646633 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:39:17.646806 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:17.646775 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:39:17.792702 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:17.792553 2561 generic.go:358] "Generic (PLEG): container finished" podID="0cf50135-6483-41fd-a681-60b2bfc1a66d" containerID="caf0133608aa38f2e238cdb66c68a34fc4ceefcd11a325618d6536cd9bcde7bf" exitCode=0 Apr 24 16:39:17.793093 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:17.792637 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k7nht" event={"ID":"0cf50135-6483-41fd-a681-60b2bfc1a66d","Type":"ContainerDied","Data":"caf0133608aa38f2e238cdb66c68a34fc4ceefcd11a325618d6536cd9bcde7bf"} Apr 24 16:39:17.902505 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:17.902436 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2p9kk"] Apr 24 16:39:17.902633 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:17.902534 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:39:17.902633 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:17.902610 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c" Apr 24 16:39:17.905388 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:17.905365 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6c75x"] Apr 24 16:39:17.905523 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:17.905461 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:39:17.905573 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:17.905543 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:39:18.796399 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:18.796368 2561 generic.go:358] "Generic (PLEG): container finished" podID="0cf50135-6483-41fd-a681-60b2bfc1a66d" containerID="9693bf6f58e42da06297a2da08ea5a451101587ae27ff4536207c50d33cf8b46" exitCode=0 Apr 24 16:39:18.796830 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:18.796454 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k7nht" event={"ID":"0cf50135-6483-41fd-a681-60b2bfc1a66d","Type":"ContainerDied","Data":"9693bf6f58e42da06297a2da08ea5a451101587ae27ff4536207c50d33cf8b46"} Apr 24 16:39:19.647062 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:19.646980 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:39:19.647250 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:19.646980 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:39:19.647250 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:19.647107 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c" Apr 24 16:39:19.647374 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:19.647276 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:39:21.647366 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:21.647330 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:39:21.647986 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:21.647424 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c" Apr 24 16:39:21.647986 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:21.647477 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:39:21.647986 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:21.647556 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:39:23.647202 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:23.647166 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x" Apr 24 16:39:23.647719 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:23.647207 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk" Apr 24 16:39:23.647719 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:23.647315 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6c75x" podUID="7abff685-57e5-4da6-b220-dbc6c56835ed" Apr 24 16:39:23.647719 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:23.647458 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2p9kk" podUID="f32abd88-b479-49d4-b013-a952386d695c" Apr 24 16:39:24.015030 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.015000 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-44.ec2.internal" event="NodeReady" Apr 24 16:39:24.015224 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.015139 2561 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 16:39:24.076959 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.076931 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5578d6657b-bmlx2"] Apr 24 16:39:24.106542 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.106505 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hs8hk"] Apr 24 16:39:24.106696 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.106671 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:24.109423 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.109395 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 16:39:24.109954 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.109931 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 16:39:24.110867 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.110848 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ff9nx\"" Apr 24 16:39:24.110867 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.110859 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 16:39:24.118417 ip-10-0-128-44 
kubenswrapper[2561]: I0424 16:39:24.118397 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 16:39:24.124578 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.124558 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5578d6657b-bmlx2"] Apr 24 16:39:24.124693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.124585 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hs8hk"] Apr 24 16:39:24.124693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.124678 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hs8hk" Apr 24 16:39:24.127138 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.127074 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zcktm\"" Apr 24 16:39:24.127464 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.127429 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 16:39:24.127594 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.127574 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 16:39:24.127672 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.127651 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 16:39:24.164341 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.164306 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ncrkw"] Apr 24 16:39:24.177051 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.177028 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ncrkw" Apr 24 16:39:24.179775 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.179755 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 16:39:24.179875 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.179795 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 16:39:24.179875 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.179795 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 16:39:24.179875 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.179820 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-x6pvx\"" Apr 24 16:39:24.179994 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.179890 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 16:39:24.183951 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.183917 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ncrkw"] Apr 24 16:39:24.189051 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.189030 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60b92d77-634a-4247-9238-75288e189203-trusted-ca\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:24.189160 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.189058 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9cwd\" (UniqueName: 
\"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-kube-api-access-t9cwd\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:24.189160 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.189091 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzfdk\" (UniqueName: \"kubernetes.io/projected/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-kube-api-access-dzfdk\") pod \"ingress-canary-hs8hk\" (UID: \"d1d5372f-dc95-47d2-a6be-7d40b52c4c1d\") " pod="openshift-ingress-canary/ingress-canary-hs8hk" Apr 24 16:39:24.189160 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.189150 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60b92d77-634a-4247-9238-75288e189203-installation-pull-secrets\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:24.189372 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.189172 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-bound-sa-token\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:24.189372 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.189198 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60b92d77-634a-4247-9238-75288e189203-registry-certificates\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " 
pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:24.189372 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.189237 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:24.189372 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.189363 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/60b92d77-634a-4247-9238-75288e189203-image-registry-private-configuration\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:24.189526 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.189395 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert\") pod \"ingress-canary-hs8hk\" (UID: \"d1d5372f-dc95-47d2-a6be-7d40b52c4c1d\") " pod="openshift-ingress-canary/ingress-canary-hs8hk" Apr 24 16:39:24.189526 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.189442 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60b92d77-634a-4247-9238-75288e189203-ca-trust-extracted\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:24.290129 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290047 2561 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60b92d77-634a-4247-9238-75288e189203-registry-certificates\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:24.290129 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290095 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:24.290316 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290143 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/60b92d77-634a-4247-9238-75288e189203-image-registry-private-configuration\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:24.290316 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290167 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert\") pod \"ingress-canary-hs8hk\" (UID: \"d1d5372f-dc95-47d2-a6be-7d40b52c4c1d\") " pod="openshift-ingress-canary/ingress-canary-hs8hk" Apr 24 16:39:24.290316 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290193 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60b92d77-634a-4247-9238-75288e189203-ca-trust-extracted\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " 
pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:24.290316 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:24.290299 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:24.290491 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:24.290320 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5578d6657b-bmlx2: secret "image-registry-tls" not found
Apr 24 16:39:24.290491 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:24.290366 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls podName:60b92d77-634a-4247-9238-75288e189203 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:24.790352259 +0000 UTC m=+33.643222952 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls") pod "image-registry-5578d6657b-bmlx2" (UID: "60b92d77-634a-4247-9238-75288e189203") : secret "image-registry-tls" not found
Apr 24 16:39:24.290491 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:24.290304 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:24.290491 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290401 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wq7f\" (UniqueName: \"kubernetes.io/projected/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-kube-api-access-9wq7f\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:24.290491 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:24.290429 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert podName:d1d5372f-dc95-47d2-a6be-7d40b52c4c1d nodeName:}" failed. No retries permitted until 2026-04-24 16:39:24.790416672 +0000 UTC m=+33.643287369 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert") pod "ingress-canary-hs8hk" (UID: "d1d5372f-dc95-47d2-a6be-7d40b52c4c1d") : secret "canary-serving-cert" not found
Apr 24 16:39:24.290690 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290518 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60b92d77-634a-4247-9238-75288e189203-ca-trust-extracted\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:24.290690 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290518 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60b92d77-634a-4247-9238-75288e189203-trusted-ca\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:24.290690 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290576 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9cwd\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-kube-api-access-t9cwd\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:24.290690 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290608 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:24.290690 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290647 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzfdk\" (UniqueName: \"kubernetes.io/projected/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-kube-api-access-dzfdk\") pod \"ingress-canary-hs8hk\" (UID: \"d1d5372f-dc95-47d2-a6be-7d40b52c4c1d\") " pod="openshift-ingress-canary/ingress-canary-hs8hk"
Apr 24 16:39:24.290690 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290668 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-tmp-dir\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:24.290923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290694 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60b92d77-634a-4247-9238-75288e189203-installation-pull-secrets\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:24.290923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290720 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-bound-sa-token\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:24.290923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290739 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60b92d77-634a-4247-9238-75288e189203-registry-certificates\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:24.290923 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.290748 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-config-volume\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:24.291393 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.291375 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60b92d77-634a-4247-9238-75288e189203-trusted-ca\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:24.294351 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.294237 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/60b92d77-634a-4247-9238-75288e189203-image-registry-private-configuration\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:24.294413 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.294301 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60b92d77-634a-4247-9238-75288e189203-installation-pull-secrets\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:24.303033 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.303014 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9cwd\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-kube-api-access-t9cwd\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:24.304054 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.304040 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-bound-sa-token\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:24.306175 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.306156 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzfdk\" (UniqueName: \"kubernetes.io/projected/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-kube-api-access-dzfdk\") pod \"ingress-canary-hs8hk\" (UID: \"d1d5372f-dc95-47d2-a6be-7d40b52c4c1d\") " pod="openshift-ingress-canary/ingress-canary-hs8hk"
Apr 24 16:39:24.391161 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.391129 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:24.391293 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.391178 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-tmp-dir\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:24.391293 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.391203 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-config-volume\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:24.391293 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.391273 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wq7f\" (UniqueName: \"kubernetes.io/projected/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-kube-api-access-9wq7f\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:24.391293 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:24.391277 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:24.391488 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:24.391346 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls podName:d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:24.891325876 +0000 UTC m=+33.744196592 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls") pod "dns-default-ncrkw" (UID: "d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:24.391488 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.391454 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-tmp-dir\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:24.391697 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.391678 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-config-volume\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:24.410731 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.410703 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wq7f\" (UniqueName: \"kubernetes.io/projected/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-kube-api-access-9wq7f\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:24.534462 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.534439 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-l8hzw"]
Apr 24 16:39:24.541565 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.541493 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l8hzw"
Apr 24 16:39:24.543769 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.543753 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-vbkkw\""
Apr 24 16:39:24.693849 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.693820 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5c88c86-c895-48fe-8dd8-898499027e1f-tmp-dir\") pod \"node-resolver-l8hzw\" (UID: \"e5c88c86-c895-48fe-8dd8-898499027e1f\") " pod="openshift-dns/node-resolver-l8hzw"
Apr 24 16:39:24.694178 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.694005 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5c88c86-c895-48fe-8dd8-898499027e1f-hosts-file\") pod \"node-resolver-l8hzw\" (UID: \"e5c88c86-c895-48fe-8dd8-898499027e1f\") " pod="openshift-dns/node-resolver-l8hzw"
Apr 24 16:39:24.694178 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.694102 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkbnl\" (UniqueName: \"kubernetes.io/projected/e5c88c86-c895-48fe-8dd8-898499027e1f-kube-api-access-nkbnl\") pod \"node-resolver-l8hzw\" (UID: \"e5c88c86-c895-48fe-8dd8-898499027e1f\") " pod="openshift-dns/node-resolver-l8hzw"
Apr 24 16:39:24.794727 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.794656 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert\") pod \"ingress-canary-hs8hk\" (UID: \"d1d5372f-dc95-47d2-a6be-7d40b52c4c1d\") " pod="openshift-ingress-canary/ingress-canary-hs8hk"
Apr 24 16:39:24.794727 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.794715 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5c88c86-c895-48fe-8dd8-898499027e1f-tmp-dir\") pod \"node-resolver-l8hzw\" (UID: \"e5c88c86-c895-48fe-8dd8-898499027e1f\") " pod="openshift-dns/node-resolver-l8hzw"
Apr 24 16:39:24.794874 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.794793 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5c88c86-c895-48fe-8dd8-898499027e1f-hosts-file\") pod \"node-resolver-l8hzw\" (UID: \"e5c88c86-c895-48fe-8dd8-898499027e1f\") " pod="openshift-dns/node-resolver-l8hzw"
Apr 24 16:39:24.794874 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:24.794798 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:24.794874 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.794842 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:24.794874 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:24.794860 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert podName:d1d5372f-dc95-47d2-a6be-7d40b52c4c1d nodeName:}" failed. No retries permitted until 2026-04-24 16:39:25.794843841 +0000 UTC m=+34.647714532 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert") pod "ingress-canary-hs8hk" (UID: "d1d5372f-dc95-47d2-a6be-7d40b52c4c1d") : secret "canary-serving-cert" not found
Apr 24 16:39:24.795051 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.794905 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkbnl\" (UniqueName: \"kubernetes.io/projected/e5c88c86-c895-48fe-8dd8-898499027e1f-kube-api-access-nkbnl\") pod \"node-resolver-l8hzw\" (UID: \"e5c88c86-c895-48fe-8dd8-898499027e1f\") " pod="openshift-dns/node-resolver-l8hzw"
Apr 24 16:39:24.795051 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:24.794941 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:24.795051 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.794935 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5c88c86-c895-48fe-8dd8-898499027e1f-hosts-file\") pod \"node-resolver-l8hzw\" (UID: \"e5c88c86-c895-48fe-8dd8-898499027e1f\") " pod="openshift-dns/node-resolver-l8hzw"
Apr 24 16:39:24.795051 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:24.794959 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5578d6657b-bmlx2: secret "image-registry-tls" not found
Apr 24 16:39:24.795051 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:24.795008 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls podName:60b92d77-634a-4247-9238-75288e189203 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:25.794993512 +0000 UTC m=+34.647864220 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls") pod "image-registry-5578d6657b-bmlx2" (UID: "60b92d77-634a-4247-9238-75288e189203") : secret "image-registry-tls" not found
Apr 24 16:39:24.795228 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.795060 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5c88c86-c895-48fe-8dd8-898499027e1f-tmp-dir\") pod \"node-resolver-l8hzw\" (UID: \"e5c88c86-c895-48fe-8dd8-898499027e1f\") " pod="openshift-dns/node-resolver-l8hzw"
Apr 24 16:39:24.810179 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.810155 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k7nht" event={"ID":"0cf50135-6483-41fd-a681-60b2bfc1a66d","Type":"ContainerStarted","Data":"afde4f4add3851408c0e8dc07c881428771303522f54992cd7ce00171f110ca9"}
Apr 24 16:39:24.828654 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.828630 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkbnl\" (UniqueName: \"kubernetes.io/projected/e5c88c86-c895-48fe-8dd8-898499027e1f-kube-api-access-nkbnl\") pod \"node-resolver-l8hzw\" (UID: \"e5c88c86-c895-48fe-8dd8-898499027e1f\") " pod="openshift-dns/node-resolver-l8hzw"
Apr 24 16:39:24.850255 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.850239 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l8hzw"
Apr 24 16:39:24.857555 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:39:24.857534 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5c88c86_c895_48fe_8dd8_898499027e1f.slice/crio-b1bc47f005e790cbdbf63809f336b17586163897412e71291080feb83071fdb2 WatchSource:0}: Error finding container b1bc47f005e790cbdbf63809f336b17586163897412e71291080feb83071fdb2: Status 404 returned error can't find the container with id b1bc47f005e790cbdbf63809f336b17586163897412e71291080feb83071fdb2
Apr 24 16:39:24.895740 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:24.895705 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:24.895929 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:24.895908 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:24.896023 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:24.895974 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls podName:d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:25.895954398 +0000 UTC m=+34.748825097 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls") pod "dns-default-ncrkw" (UID: "d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:25.297241 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.297208 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs\") pod \"network-metrics-daemon-6c75x\" (UID: \"7abff685-57e5-4da6-b220-dbc6c56835ed\") " pod="openshift-multus/network-metrics-daemon-6c75x"
Apr 24 16:39:25.297387 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:25.297332 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:25.297387 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:25.297380 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs podName:7abff685-57e5-4da6-b220-dbc6c56835ed nodeName:}" failed. No retries permitted until 2026-04-24 16:39:57.297368892 +0000 UTC m=+66.150239583 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs") pod "network-metrics-daemon-6c75x" (UID: "7abff685-57e5-4da6-b220-dbc6c56835ed") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:25.397724 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.397692 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhj7k\" (UniqueName: \"kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k\") pod \"network-check-target-2p9kk\" (UID: \"f32abd88-b479-49d4-b013-a952386d695c\") " pod="openshift-network-diagnostics/network-check-target-2p9kk"
Apr 24 16:39:25.397872 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:25.397848 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:25.397872 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:25.397869 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:25.397967 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:25.397879 2561 projected.go:194] Error preparing data for projected volume kube-api-access-zhj7k for pod openshift-network-diagnostics/network-check-target-2p9kk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:25.397967 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:25.397933 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k podName:f32abd88-b479-49d4-b013-a952386d695c nodeName:}" failed. No retries permitted until 2026-04-24 16:39:57.39791967 +0000 UTC m=+66.250790365 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-zhj7k" (UniqueName: "kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k") pod "network-check-target-2p9kk" (UID: "f32abd88-b479-49d4-b013-a952386d695c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:25.408781 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.408756 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-whm4n"]
Apr 24 16:39:25.431418 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.431396 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-whm4n"]
Apr 24 16:39:25.431512 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.431495 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-whm4n"
Apr 24 16:39:25.434011 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.433992 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 16:39:25.498829 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.498808 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/608b6244-9d8e-4eea-86b2-64a5cf06e100-kubelet-config\") pod \"global-pull-secret-syncer-whm4n\" (UID: \"608b6244-9d8e-4eea-86b2-64a5cf06e100\") " pod="kube-system/global-pull-secret-syncer-whm4n"
Apr 24 16:39:25.498939 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.498844 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/608b6244-9d8e-4eea-86b2-64a5cf06e100-original-pull-secret\") pod \"global-pull-secret-syncer-whm4n\" (UID: \"608b6244-9d8e-4eea-86b2-64a5cf06e100\") " pod="kube-system/global-pull-secret-syncer-whm4n"
Apr 24 16:39:25.498939 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.498907 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/608b6244-9d8e-4eea-86b2-64a5cf06e100-dbus\") pod \"global-pull-secret-syncer-whm4n\" (UID: \"608b6244-9d8e-4eea-86b2-64a5cf06e100\") " pod="kube-system/global-pull-secret-syncer-whm4n"
Apr 24 16:39:25.599341 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.599287 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/608b6244-9d8e-4eea-86b2-64a5cf06e100-dbus\") pod \"global-pull-secret-syncer-whm4n\" (UID: \"608b6244-9d8e-4eea-86b2-64a5cf06e100\") " pod="kube-system/global-pull-secret-syncer-whm4n"
Apr 24 16:39:25.599341 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.599330 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/608b6244-9d8e-4eea-86b2-64a5cf06e100-kubelet-config\") pod \"global-pull-secret-syncer-whm4n\" (UID: \"608b6244-9d8e-4eea-86b2-64a5cf06e100\") " pod="kube-system/global-pull-secret-syncer-whm4n"
Apr 24 16:39:25.599476 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.599391 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/608b6244-9d8e-4eea-86b2-64a5cf06e100-kubelet-config\") pod \"global-pull-secret-syncer-whm4n\" (UID: \"608b6244-9d8e-4eea-86b2-64a5cf06e100\") " pod="kube-system/global-pull-secret-syncer-whm4n"
Apr 24 16:39:25.599476 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.599416 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/608b6244-9d8e-4eea-86b2-64a5cf06e100-original-pull-secret\") pod \"global-pull-secret-syncer-whm4n\" (UID: \"608b6244-9d8e-4eea-86b2-64a5cf06e100\") " pod="kube-system/global-pull-secret-syncer-whm4n"
Apr 24 16:39:25.599545 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.599501 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/608b6244-9d8e-4eea-86b2-64a5cf06e100-dbus\") pod \"global-pull-secret-syncer-whm4n\" (UID: \"608b6244-9d8e-4eea-86b2-64a5cf06e100\") " pod="kube-system/global-pull-secret-syncer-whm4n"
Apr 24 16:39:25.601890 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.601868 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/608b6244-9d8e-4eea-86b2-64a5cf06e100-original-pull-secret\") pod \"global-pull-secret-syncer-whm4n\" (UID: \"608b6244-9d8e-4eea-86b2-64a5cf06e100\") " pod="kube-system/global-pull-secret-syncer-whm4n"
Apr 24 16:39:25.648934 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.648917 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk"
Apr 24 16:39:25.649031 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.648917 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x"
Apr 24 16:39:25.657922 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.657894 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fcvt5\""
Apr 24 16:39:25.658044 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.657936 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 16:39:25.658044 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.658001 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 16:39:25.658044 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.658029 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 16:39:25.658419 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.658405 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-j4znz\""
Apr 24 16:39:25.739677 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.739657 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-whm4n"
Apr 24 16:39:25.801549 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.801520 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:25.801673 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.801558 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert\") pod \"ingress-canary-hs8hk\" (UID: \"d1d5372f-dc95-47d2-a6be-7d40b52c4c1d\") " pod="openshift-ingress-canary/ingress-canary-hs8hk"
Apr 24 16:39:25.801673 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:25.801655 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:25.801746 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:25.801684 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:25.801746 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:25.801702 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert podName:d1d5372f-dc95-47d2-a6be-7d40b52c4c1d nodeName:}" failed. No retries permitted until 2026-04-24 16:39:27.801689616 +0000 UTC m=+36.654560307 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert") pod "ingress-canary-hs8hk" (UID: "d1d5372f-dc95-47d2-a6be-7d40b52c4c1d") : secret "canary-serving-cert" not found
Apr 24 16:39:25.801746 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:25.801703 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5578d6657b-bmlx2: secret "image-registry-tls" not found
Apr 24 16:39:25.801851 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:25.801748 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls podName:60b92d77-634a-4247-9238-75288e189203 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:27.801733454 +0000 UTC m=+36.654604150 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls") pod "image-registry-5578d6657b-bmlx2" (UID: "60b92d77-634a-4247-9238-75288e189203") : secret "image-registry-tls" not found
Apr 24 16:39:25.816579 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.816554 2561 generic.go:358] "Generic (PLEG): container finished" podID="0cf50135-6483-41fd-a681-60b2bfc1a66d" containerID="afde4f4add3851408c0e8dc07c881428771303522f54992cd7ce00171f110ca9" exitCode=0
Apr 24 16:39:25.816695 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.816642 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k7nht" event={"ID":"0cf50135-6483-41fd-a681-60b2bfc1a66d","Type":"ContainerDied","Data":"afde4f4add3851408c0e8dc07c881428771303522f54992cd7ce00171f110ca9"}
Apr 24 16:39:25.818144 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.818104 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l8hzw" event={"ID":"e5c88c86-c895-48fe-8dd8-898499027e1f","Type":"ContainerStarted","Data":"1d30e596184af0c3e8192c2f673c3999bb06b328e2c9e87ce357288a6f047d60"}
Apr 24 16:39:25.818271 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.818152 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l8hzw" event={"ID":"e5c88c86-c895-48fe-8dd8-898499027e1f","Type":"ContainerStarted","Data":"b1bc47f005e790cbdbf63809f336b17586163897412e71291080feb83071fdb2"}
Apr 24 16:39:25.864168 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.864077 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-l8hzw" podStartSLOduration=1.864059654 podStartE2EDuration="1.864059654s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:25.863770804 +0000 UTC m=+34.716641518" watchObservedRunningTime="2026-04-24 16:39:25.864059654 +0000 UTC m=+34.716930367"
Apr 24 16:39:25.892007 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.891973 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-whm4n"]
Apr 24 16:39:25.897243 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:39:25.897209 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod608b6244_9d8e_4eea_86b2_64a5cf06e100.slice/crio-aa112464b1bac09e4257ef03cc43096c34bffa5f6b9a46f5ea734c95e406d998 WatchSource:0}: Error finding container aa112464b1bac09e4257ef03cc43096c34bffa5f6b9a46f5ea734c95e406d998: Status 404 returned error can't find the container with id aa112464b1bac09e4257ef03cc43096c34bffa5f6b9a46f5ea734c95e406d998
Apr 24 16:39:25.902271 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:25.902199 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:25.902354 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:25.902337 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:25.902415 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:25.902392 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls podName:d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:27.902375399 +0000 UTC m=+36.755246096 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls") pod "dns-default-ncrkw" (UID: "d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:26.821376 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:26.821330 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-whm4n" event={"ID":"608b6244-9d8e-4eea-86b2-64a5cf06e100","Type":"ContainerStarted","Data":"aa112464b1bac09e4257ef03cc43096c34bffa5f6b9a46f5ea734c95e406d998"}
Apr 24 16:39:26.824225 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:26.824194 2561 generic.go:358] "Generic (PLEG): container finished" podID="0cf50135-6483-41fd-a681-60b2bfc1a66d" containerID="7b8c781e541b88ee9de0f22533d334c5d588185489dfd6a67c369868ae513f47" exitCode=0
Apr 24 16:39:26.824341 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:26.824236 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k7nht" event={"ID":"0cf50135-6483-41fd-a681-60b2bfc1a66d","Type":"ContainerDied","Data":"7b8c781e541b88ee9de0f22533d334c5d588185489dfd6a67c369868ae513f47"}
Apr
24 16:39:27.820306 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:27.820209 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:27.820306 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:27.820261 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert\") pod \"ingress-canary-hs8hk\" (UID: \"d1d5372f-dc95-47d2-a6be-7d40b52c4c1d\") " pod="openshift-ingress-canary/ingress-canary-hs8hk" Apr 24 16:39:27.820491 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:27.820353 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:27.820491 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:27.820360 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:27.820491 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:27.820382 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5578d6657b-bmlx2: secret "image-registry-tls" not found Apr 24 16:39:27.820491 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:27.820409 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert podName:d1d5372f-dc95-47d2-a6be-7d40b52c4c1d nodeName:}" failed. No retries permitted until 2026-04-24 16:39:31.820395525 +0000 UTC m=+40.673266216 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert") pod "ingress-canary-hs8hk" (UID: "d1d5372f-dc95-47d2-a6be-7d40b52c4c1d") : secret "canary-serving-cert" not found Apr 24 16:39:27.820491 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:27.820427 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls podName:60b92d77-634a-4247-9238-75288e189203 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:31.820415825 +0000 UTC m=+40.673286520 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls") pod "image-registry-5578d6657b-bmlx2" (UID: "60b92d77-634a-4247-9238-75288e189203") : secret "image-registry-tls" not found Apr 24 16:39:27.829847 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:27.829823 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k7nht" event={"ID":"0cf50135-6483-41fd-a681-60b2bfc1a66d","Type":"ContainerStarted","Data":"7ff249bc4c34d8db58f2f97fa2839fc46f91f22a57ad185b64a1d6501fe69a40"} Apr 24 16:39:27.853550 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:27.853505 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-k7nht" podStartSLOduration=6.593643922 podStartE2EDuration="36.853490834s" podCreationTimestamp="2026-04-24 16:38:51 +0000 UTC" firstStartedPulling="2026-04-24 16:38:54.272377556 +0000 UTC m=+3.125248260" lastFinishedPulling="2026-04-24 16:39:24.53222448 +0000 UTC m=+33.385095172" observedRunningTime="2026-04-24 16:39:27.852962252 +0000 UTC m=+36.705832965" watchObservedRunningTime="2026-04-24 16:39:27.853490834 +0000 UTC m=+36.706361548" Apr 24 16:39:27.921450 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:27.921398 2561 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw" Apr 24 16:39:27.921639 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:27.921540 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:27.921639 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:27.921623 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls podName:d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:31.921597815 +0000 UTC m=+40.774468511 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls") pod "dns-default-ncrkw" (UID: "d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b") : secret "dns-default-metrics-tls" not found Apr 24 16:39:29.834697 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:29.834669 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-whm4n" event={"ID":"608b6244-9d8e-4eea-86b2-64a5cf06e100","Type":"ContainerStarted","Data":"ba9c1845b4f4c7d3f482a10a61483ff395915c56a642e0c751ab3d9751f20617"} Apr 24 16:39:30.850512 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:30.850461 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-whm4n" podStartSLOduration=2.07718705 podStartE2EDuration="5.850431275s" podCreationTimestamp="2026-04-24 16:39:25 +0000 UTC" firstStartedPulling="2026-04-24 16:39:25.90047559 +0000 UTC m=+34.753346281" lastFinishedPulling="2026-04-24 16:39:29.673719815 +0000 UTC m=+38.526590506" observedRunningTime="2026-04-24 
16:39:30.850106977 +0000 UTC m=+39.702977691" watchObservedRunningTime="2026-04-24 16:39:30.850431275 +0000 UTC m=+39.703301986" Apr 24 16:39:31.121035 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:31.120966 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh"] Apr 24 16:39:31.147855 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:31.147822 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh"] Apr 24 16:39:31.147974 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:31.147874 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh" Apr 24 16:39:31.150403 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:31.150378 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 16:39:31.150403 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:31.150391 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 16:39:31.150403 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:31.150378 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-gx867\"" Apr 24 16:39:31.249328 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:31.249296 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e7b94c4f-1b89-4a52-ba90-78436120bab7-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mr4bh\" (UID: \"e7b94c4f-1b89-4a52-ba90-78436120bab7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh" Apr 24 16:39:31.249485 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:31.249342 2561 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mr4bh\" (UID: \"e7b94c4f-1b89-4a52-ba90-78436120bab7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh" Apr 24 16:39:31.349876 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:31.349709 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e7b94c4f-1b89-4a52-ba90-78436120bab7-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mr4bh\" (UID: \"e7b94c4f-1b89-4a52-ba90-78436120bab7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh" Apr 24 16:39:31.350199 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:31.350162 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mr4bh\" (UID: \"e7b94c4f-1b89-4a52-ba90-78436120bab7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh" Apr 24 16:39:31.350418 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:31.350394 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:39:31.350498 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:31.350487 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert podName:e7b94c4f-1b89-4a52-ba90-78436120bab7 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:31.850465985 +0000 UTC m=+40.703336698 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mr4bh" (UID: "e7b94c4f-1b89-4a52-ba90-78436120bab7") : secret "networking-console-plugin-cert" not found Apr 24 16:39:31.361997 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:31.361974 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e7b94c4f-1b89-4a52-ba90-78436120bab7-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-mr4bh\" (UID: \"e7b94c4f-1b89-4a52-ba90-78436120bab7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh" Apr 24 16:39:31.854444 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:31.854414 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:31.854798 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:31.854454 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert\") pod \"ingress-canary-hs8hk\" (UID: \"d1d5372f-dc95-47d2-a6be-7d40b52c4c1d\") " pod="openshift-ingress-canary/ingress-canary-hs8hk" Apr 24 16:39:31.854798 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:31.854476 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mr4bh\" (UID: \"e7b94c4f-1b89-4a52-ba90-78436120bab7\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh" Apr 24 16:39:31.854798 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:31.854553 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:31.854798 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:31.854575 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5578d6657b-bmlx2: secret "image-registry-tls" not found Apr 24 16:39:31.854798 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:31.854575 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:39:31.854798 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:31.854556 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:31.854798 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:31.854625 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls podName:60b92d77-634a-4247-9238-75288e189203 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:39.854603855 +0000 UTC m=+48.707474551 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls") pod "image-registry-5578d6657b-bmlx2" (UID: "60b92d77-634a-4247-9238-75288e189203") : secret "image-registry-tls" not found Apr 24 16:39:31.854798 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:31.854638 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert podName:e7b94c4f-1b89-4a52-ba90-78436120bab7 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:32.854632329 +0000 UTC m=+41.707503021 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mr4bh" (UID: "e7b94c4f-1b89-4a52-ba90-78436120bab7") : secret "networking-console-plugin-cert" not found Apr 24 16:39:31.854798 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:31.854648 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert podName:d1d5372f-dc95-47d2-a6be-7d40b52c4c1d nodeName:}" failed. No retries permitted until 2026-04-24 16:39:39.854642813 +0000 UTC m=+48.707513505 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert") pod "ingress-canary-hs8hk" (UID: "d1d5372f-dc95-47d2-a6be-7d40b52c4c1d") : secret "canary-serving-cert" not found Apr 24 16:39:31.955494 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:31.955456 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw" Apr 24 16:39:31.955662 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:31.955604 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:31.955702 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:31.955663 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls podName:d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:39.955649237 +0000 UTC m=+48.808519933 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls") pod "dns-default-ncrkw" (UID: "d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b") : secret "dns-default-metrics-tls" not found Apr 24 16:39:32.863042 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:32.863009 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mr4bh\" (UID: \"e7b94c4f-1b89-4a52-ba90-78436120bab7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh" Apr 24 16:39:32.863419 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:32.863172 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:39:32.863419 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:32.863231 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert podName:e7b94c4f-1b89-4a52-ba90-78436120bab7 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:34.86321697 +0000 UTC m=+43.716087661 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mr4bh" (UID: "e7b94c4f-1b89-4a52-ba90-78436120bab7") : secret "networking-console-plugin-cert" not found Apr 24 16:39:33.340631 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:33.340603 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-6gd9f"] Apr 24 16:39:33.375411 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:33.375385 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-6gd9f"] Apr 24 16:39:33.375552 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:33.375496 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6gd9f" Apr 24 16:39:33.377864 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:33.377837 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 16:39:33.378878 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:33.378855 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-8x2zx\"" Apr 24 16:39:33.378996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:33.378863 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 16:39:33.467058 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:33.467021 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2jfj\" (UniqueName: \"kubernetes.io/projected/e7643e3d-a60c-4b47-af12-ca48b6e20569-kube-api-access-p2jfj\") 
pod \"migrator-74bb7799d9-6gd9f\" (UID: \"e7643e3d-a60c-4b47-af12-ca48b6e20569\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6gd9f" Apr 24 16:39:33.567899 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:33.567867 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2jfj\" (UniqueName: \"kubernetes.io/projected/e7643e3d-a60c-4b47-af12-ca48b6e20569-kube-api-access-p2jfj\") pod \"migrator-74bb7799d9-6gd9f\" (UID: \"e7643e3d-a60c-4b47-af12-ca48b6e20569\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6gd9f" Apr 24 16:39:33.577097 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:33.577071 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2jfj\" (UniqueName: \"kubernetes.io/projected/e7643e3d-a60c-4b47-af12-ca48b6e20569-kube-api-access-p2jfj\") pod \"migrator-74bb7799d9-6gd9f\" (UID: \"e7643e3d-a60c-4b47-af12-ca48b6e20569\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6gd9f" Apr 24 16:39:33.684305 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:33.684280 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6gd9f" Apr 24 16:39:33.822404 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:33.822371 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-6gd9f"] Apr 24 16:39:33.828684 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:39:33.828627 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7643e3d_a60c_4b47_af12_ca48b6e20569.slice/crio-223acbcf823aa216db09aab96c387c7eb54e2037912e13282e0ea59ae76a251a WatchSource:0}: Error finding container 223acbcf823aa216db09aab96c387c7eb54e2037912e13282e0ea59ae76a251a: Status 404 returned error can't find the container with id 223acbcf823aa216db09aab96c387c7eb54e2037912e13282e0ea59ae76a251a Apr 24 16:39:33.842637 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:33.842610 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6gd9f" event={"ID":"e7643e3d-a60c-4b47-af12-ca48b6e20569","Type":"ContainerStarted","Data":"223acbcf823aa216db09aab96c387c7eb54e2037912e13282e0ea59ae76a251a"} Apr 24 16:39:34.351377 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.351343 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2sf2l"] Apr 24 16:39:34.365261 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.365229 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2sf2l"] Apr 24 16:39:34.365408 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.365350 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2sf2l" Apr 24 16:39:34.368015 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.367987 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 16:39:34.368143 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.368037 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 16:39:34.369233 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.369199 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 16:39:34.369354 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.369277 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-9kkrf\"" Apr 24 16:39:34.369354 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.369310 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 16:39:34.475193 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.475146 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljm26\" (UniqueName: \"kubernetes.io/projected/6127d1a2-9c62-4d77-ace7-56f39993f251-kube-api-access-ljm26\") pod \"service-ca-865cb79987-2sf2l\" (UID: \"6127d1a2-9c62-4d77-ace7-56f39993f251\") " pod="openshift-service-ca/service-ca-865cb79987-2sf2l" Apr 24 16:39:34.475330 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.475274 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6127d1a2-9c62-4d77-ace7-56f39993f251-signing-key\") pod \"service-ca-865cb79987-2sf2l\" (UID: \"6127d1a2-9c62-4d77-ace7-56f39993f251\") " 
pod="openshift-service-ca/service-ca-865cb79987-2sf2l" Apr 24 16:39:34.475330 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.475301 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6127d1a2-9c62-4d77-ace7-56f39993f251-signing-cabundle\") pod \"service-ca-865cb79987-2sf2l\" (UID: \"6127d1a2-9c62-4d77-ace7-56f39993f251\") " pod="openshift-service-ca/service-ca-865cb79987-2sf2l" Apr 24 16:39:34.575727 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.575693 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljm26\" (UniqueName: \"kubernetes.io/projected/6127d1a2-9c62-4d77-ace7-56f39993f251-kube-api-access-ljm26\") pod \"service-ca-865cb79987-2sf2l\" (UID: \"6127d1a2-9c62-4d77-ace7-56f39993f251\") " pod="openshift-service-ca/service-ca-865cb79987-2sf2l" Apr 24 16:39:34.575905 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.575787 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6127d1a2-9c62-4d77-ace7-56f39993f251-signing-key\") pod \"service-ca-865cb79987-2sf2l\" (UID: \"6127d1a2-9c62-4d77-ace7-56f39993f251\") " pod="openshift-service-ca/service-ca-865cb79987-2sf2l" Apr 24 16:39:34.575905 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.575805 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6127d1a2-9c62-4d77-ace7-56f39993f251-signing-cabundle\") pod \"service-ca-865cb79987-2sf2l\" (UID: \"6127d1a2-9c62-4d77-ace7-56f39993f251\") " pod="openshift-service-ca/service-ca-865cb79987-2sf2l" Apr 24 16:39:34.576498 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.576478 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/6127d1a2-9c62-4d77-ace7-56f39993f251-signing-cabundle\") pod \"service-ca-865cb79987-2sf2l\" (UID: \"6127d1a2-9c62-4d77-ace7-56f39993f251\") " pod="openshift-service-ca/service-ca-865cb79987-2sf2l" Apr 24 16:39:34.578310 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.578287 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6127d1a2-9c62-4d77-ace7-56f39993f251-signing-key\") pod \"service-ca-865cb79987-2sf2l\" (UID: \"6127d1a2-9c62-4d77-ace7-56f39993f251\") " pod="openshift-service-ca/service-ca-865cb79987-2sf2l" Apr 24 16:39:34.584512 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.584484 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljm26\" (UniqueName: \"kubernetes.io/projected/6127d1a2-9c62-4d77-ace7-56f39993f251-kube-api-access-ljm26\") pod \"service-ca-865cb79987-2sf2l\" (UID: \"6127d1a2-9c62-4d77-ace7-56f39993f251\") " pod="openshift-service-ca/service-ca-865cb79987-2sf2l" Apr 24 16:39:34.676518 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.676457 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2sf2l" Apr 24 16:39:34.793612 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.793562 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2sf2l"] Apr 24 16:39:34.834970 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.834944 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-l8hzw_e5c88c86-c895-48fe-8dd8-898499027e1f/dns-node-resolver/0.log" Apr 24 16:39:34.878236 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:34.878209 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mr4bh\" (UID: \"e7b94c4f-1b89-4a52-ba90-78436120bab7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh" Apr 24 16:39:34.878385 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:34.878366 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:39:34.878449 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:34.878439 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert podName:e7b94c4f-1b89-4a52-ba90-78436120bab7 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:38.878423576 +0000 UTC m=+47.731294268 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mr4bh" (UID: "e7b94c4f-1b89-4a52-ba90-78436120bab7") : secret "networking-console-plugin-cert" not found Apr 24 16:39:35.024433 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:39:35.024398 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6127d1a2_9c62_4d77_ace7_56f39993f251.slice/crio-c704ab35c5ca5f12450e43885b6812a113a651c82a4296653d3f93903f930142 WatchSource:0}: Error finding container c704ab35c5ca5f12450e43885b6812a113a651c82a4296653d3f93903f930142: Status 404 returned error can't find the container with id c704ab35c5ca5f12450e43885b6812a113a651c82a4296653d3f93903f930142 Apr 24 16:39:35.847753 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:35.847718 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2sf2l" event={"ID":"6127d1a2-9c62-4d77-ace7-56f39993f251","Type":"ContainerStarted","Data":"c704ab35c5ca5f12450e43885b6812a113a651c82a4296653d3f93903f930142"} Apr 24 16:39:35.849427 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:35.849390 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6gd9f" event={"ID":"e7643e3d-a60c-4b47-af12-ca48b6e20569","Type":"ContainerStarted","Data":"a3bf0bb7fecb1dd0b795763a4689d4275ae66e54c03581f53ff74bddbd300872"} Apr 24 16:39:35.849427 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:35.849427 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6gd9f" event={"ID":"e7643e3d-a60c-4b47-af12-ca48b6e20569","Type":"ContainerStarted","Data":"b232393d9ac49a419d7bec9ad93a0df80ae159cd34b8934265199997d343078c"} Apr 24 16:39:35.872318 ip-10-0-128-44 
kubenswrapper[2561]: I0424 16:39:35.872251 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6gd9f" podStartSLOduration=1.331411804 podStartE2EDuration="2.872236719s" podCreationTimestamp="2026-04-24 16:39:33 +0000 UTC" firstStartedPulling="2026-04-24 16:39:33.830839897 +0000 UTC m=+42.683710601" lastFinishedPulling="2026-04-24 16:39:35.371664817 +0000 UTC m=+44.224535516" observedRunningTime="2026-04-24 16:39:35.871657073 +0000 UTC m=+44.724527786" watchObservedRunningTime="2026-04-24 16:39:35.872236719 +0000 UTC m=+44.725107432" Apr 24 16:39:36.035907 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:36.035881 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wcmwg_5461250a-930b-44a6-ba77-773586840e32/node-ca/0.log" Apr 24 16:39:37.854087 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:37.854000 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2sf2l" event={"ID":"6127d1a2-9c62-4d77-ace7-56f39993f251","Type":"ContainerStarted","Data":"273a1ff97d38a2f722f0ad63ebbd4904ed0fa52a722f4403bc7753b7a497a475"} Apr 24 16:39:37.872477 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:37.872428 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-2sf2l" podStartSLOduration=1.37203923 podStartE2EDuration="3.872413857s" podCreationTimestamp="2026-04-24 16:39:34 +0000 UTC" firstStartedPulling="2026-04-24 16:39:35.02636073 +0000 UTC m=+43.879231440" lastFinishedPulling="2026-04-24 16:39:37.526735376 +0000 UTC m=+46.379606067" observedRunningTime="2026-04-24 16:39:37.871241199 +0000 UTC m=+46.724111935" watchObservedRunningTime="2026-04-24 16:39:37.872413857 +0000 UTC m=+46.725284570" Apr 24 16:39:38.909275 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:38.909235 2561 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mr4bh\" (UID: \"e7b94c4f-1b89-4a52-ba90-78436120bab7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh" Apr 24 16:39:38.909742 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:38.909390 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:39:38.909742 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:38.909456 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert podName:e7b94c4f-1b89-4a52-ba90-78436120bab7 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:46.909440294 +0000 UTC m=+55.762310989 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mr4bh" (UID: "e7b94c4f-1b89-4a52-ba90-78436120bab7") : secret "networking-console-plugin-cert" not found Apr 24 16:39:39.919048 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:39.919012 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:39.919501 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:39.919066 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert\") pod \"ingress-canary-hs8hk\" 
(UID: \"d1d5372f-dc95-47d2-a6be-7d40b52c4c1d\") " pod="openshift-ingress-canary/ingress-canary-hs8hk" Apr 24 16:39:39.919501 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:39.919173 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:39.919501 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:39.919191 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5578d6657b-bmlx2: secret "image-registry-tls" not found Apr 24 16:39:39.919501 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:39.919225 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:39.919501 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:39.919241 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls podName:60b92d77-634a-4247-9238-75288e189203 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:55.919226085 +0000 UTC m=+64.772096776 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls") pod "image-registry-5578d6657b-bmlx2" (UID: "60b92d77-634a-4247-9238-75288e189203") : secret "image-registry-tls" not found Apr 24 16:39:39.919501 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:39.919280 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert podName:d1d5372f-dc95-47d2-a6be-7d40b52c4c1d nodeName:}" failed. No retries permitted until 2026-04-24 16:39:55.919265626 +0000 UTC m=+64.772136324 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert") pod "ingress-canary-hs8hk" (UID: "d1d5372f-dc95-47d2-a6be-7d40b52c4c1d") : secret "canary-serving-cert" not found Apr 24 16:39:40.020331 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:40.020304 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw" Apr 24 16:39:40.020478 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:40.020390 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:40.020478 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:40.020432 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls podName:d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:56.020419972 +0000 UTC m=+64.873290673 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls") pod "dns-default-ncrkw" (UID: "d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b") : secret "dns-default-metrics-tls" not found Apr 24 16:39:46.973610 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:46.973578 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mr4bh\" (UID: \"e7b94c4f-1b89-4a52-ba90-78436120bab7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh" Apr 24 16:39:46.974195 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:46.973763 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:39:46.974195 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:46.973833 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert podName:e7b94c4f-1b89-4a52-ba90-78436120bab7 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:02.973814235 +0000 UTC m=+71.826684926 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-mr4bh" (UID: "e7b94c4f-1b89-4a52-ba90-78436120bab7") : secret "networking-console-plugin-cert" not found Apr 24 16:39:48.807194 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:48.807139 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rrwzc" Apr 24 16:39:55.525789 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.525756 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-pq8gb"] Apr 24 16:39:55.561671 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.561564 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-pq8gb"] Apr 24 16:39:55.561671 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.561594 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5578d6657b-bmlx2"] Apr 24 16:39:55.561896 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:39:55.561772 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" podUID="60b92d77-634a-4247-9238-75288e189203" Apr 24 16:39:55.561896 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.561802 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-pq8gb" Apr 24 16:39:55.565195 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.565173 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 16:39:55.565317 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.565210 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 16:39:55.566086 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.566071 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-4kw2b\"" Apr 24 16:39:55.633712 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.633665 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzz8p\" (UniqueName: \"kubernetes.io/projected/9be54bef-8146-4560-aef7-387e885e8227-kube-api-access-nzz8p\") pod \"downloads-6bcc868b7-pq8gb\" (UID: \"9be54bef-8146-4560-aef7-387e885e8227\") " pod="openshift-console/downloads-6bcc868b7-pq8gb" Apr 24 16:39:55.635072 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.635048 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-l2q9c"] Apr 24 16:39:55.666731 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.666703 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 16:39:55.671237 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.671221 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 16:39:55.671578 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.671563 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 16:39:55.672035 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.672014 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-r6qpw\"" Apr 24 16:39:55.672035 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.672023 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 16:39:55.672211 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.672049 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 16:39:55.677961 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.677943 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l2q9c"] Apr 24 16:39:55.725829 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.725806 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6c787d785f-2vkbk"] Apr 24 16:39:55.734436 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.734415 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/adcfe200-f21b-4233-a47a-d3e925e32b96-crio-socket\") pod \"insights-runtime-extractor-l2q9c\" (UID: \"adcfe200-f21b-4233-a47a-d3e925e32b96\") " pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 
16:39:55.734512 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.734453 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/adcfe200-f21b-4233-a47a-d3e925e32b96-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l2q9c\" (UID: \"adcfe200-f21b-4233-a47a-d3e925e32b96\") " pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 16:39:55.734512 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.734471 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fjf9\" (UniqueName: \"kubernetes.io/projected/adcfe200-f21b-4233-a47a-d3e925e32b96-kube-api-access-6fjf9\") pod \"insights-runtime-extractor-l2q9c\" (UID: \"adcfe200-f21b-4233-a47a-d3e925e32b96\") " pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 16:39:55.734625 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.734516 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/adcfe200-f21b-4233-a47a-d3e925e32b96-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l2q9c\" (UID: \"adcfe200-f21b-4233-a47a-d3e925e32b96\") " pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 16:39:55.734625 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.734603 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/adcfe200-f21b-4233-a47a-d3e925e32b96-data-volume\") pod \"insights-runtime-extractor-l2q9c\" (UID: \"adcfe200-f21b-4233-a47a-d3e925e32b96\") " pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 16:39:55.734724 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.734690 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzz8p\" 
(UniqueName: \"kubernetes.io/projected/9be54bef-8146-4560-aef7-387e885e8227-kube-api-access-nzz8p\") pod \"downloads-6bcc868b7-pq8gb\" (UID: \"9be54bef-8146-4560-aef7-387e885e8227\") " pod="openshift-console/downloads-6bcc868b7-pq8gb" Apr 24 16:39:55.751703 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.751684 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c787d785f-2vkbk"] Apr 24 16:39:55.751777 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.751770 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c787d785f-2vkbk" Apr 24 16:39:55.758219 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.758197 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzz8p\" (UniqueName: \"kubernetes.io/projected/9be54bef-8146-4560-aef7-387e885e8227-kube-api-access-nzz8p\") pod \"downloads-6bcc868b7-pq8gb\" (UID: \"9be54bef-8146-4560-aef7-387e885e8227\") " pod="openshift-console/downloads-6bcc868b7-pq8gb" Apr 24 16:39:55.835434 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.835369 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/adcfe200-f21b-4233-a47a-d3e925e32b96-crio-socket\") pod \"insights-runtime-extractor-l2q9c\" (UID: \"adcfe200-f21b-4233-a47a-d3e925e32b96\") " pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 16:39:55.835434 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.835403 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/adcfe200-f21b-4233-a47a-d3e925e32b96-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l2q9c\" (UID: \"adcfe200-f21b-4233-a47a-d3e925e32b96\") " pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 16:39:55.835434 ip-10-0-128-44 kubenswrapper[2561]: I0424 
16:39:55.835423 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-ca-trust-extracted\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk" Apr 24 16:39:55.835637 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.835459 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fjf9\" (UniqueName: \"kubernetes.io/projected/adcfe200-f21b-4233-a47a-d3e925e32b96-kube-api-access-6fjf9\") pod \"insights-runtime-extractor-l2q9c\" (UID: \"adcfe200-f21b-4233-a47a-d3e925e32b96\") " pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 16:39:55.835637 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.835498 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-registry-tls\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk" Apr 24 16:39:55.835637 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.835512 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/adcfe200-f21b-4233-a47a-d3e925e32b96-crio-socket\") pod \"insights-runtime-extractor-l2q9c\" (UID: \"adcfe200-f21b-4233-a47a-d3e925e32b96\") " pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 16:39:55.835637 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.835554 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-image-registry-private-configuration\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk" Apr 24 16:39:55.835637 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.835587 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-installation-pull-secrets\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk" Apr 24 16:39:55.835637 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.835611 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrzth\" (UniqueName: \"kubernetes.io/projected/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-kube-api-access-vrzth\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk" Apr 24 16:39:55.835637 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.835636 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-bound-sa-token\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk" Apr 24 16:39:55.835896 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.835680 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/adcfe200-f21b-4233-a47a-d3e925e32b96-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l2q9c\" (UID: 
\"adcfe200-f21b-4233-a47a-d3e925e32b96\") " pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 16:39:55.835896 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.835740 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-registry-certificates\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk" Apr 24 16:39:55.835896 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.835775 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/adcfe200-f21b-4233-a47a-d3e925e32b96-data-volume\") pod \"insights-runtime-extractor-l2q9c\" (UID: \"adcfe200-f21b-4233-a47a-d3e925e32b96\") " pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 16:39:55.835896 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.835829 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-trusted-ca\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk" Apr 24 16:39:55.836085 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.835924 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/adcfe200-f21b-4233-a47a-d3e925e32b96-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l2q9c\" (UID: \"adcfe200-f21b-4233-a47a-d3e925e32b96\") " pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 16:39:55.836167 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.836089 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/adcfe200-f21b-4233-a47a-d3e925e32b96-data-volume\") pod \"insights-runtime-extractor-l2q9c\" (UID: \"adcfe200-f21b-4233-a47a-d3e925e32b96\") " pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 16:39:55.837721 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.837706 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/adcfe200-f21b-4233-a47a-d3e925e32b96-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l2q9c\" (UID: \"adcfe200-f21b-4233-a47a-d3e925e32b96\") " pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 16:39:55.845473 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.845452 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fjf9\" (UniqueName: \"kubernetes.io/projected/adcfe200-f21b-4233-a47a-d3e925e32b96-kube-api-access-6fjf9\") pod \"insights-runtime-extractor-l2q9c\" (UID: \"adcfe200-f21b-4233-a47a-d3e925e32b96\") " pod="openshift-insights/insights-runtime-extractor-l2q9c" Apr 24 16:39:55.873659 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.873635 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-pq8gb" Apr 24 16:39:55.888772 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.888747 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:55.893197 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.893183 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:55.936173 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936151 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60b92d77-634a-4247-9238-75288e189203-trusted-ca\") pod \"60b92d77-634a-4247-9238-75288e189203\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " Apr 24 16:39:55.936266 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936189 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-bound-sa-token\") pod \"60b92d77-634a-4247-9238-75288e189203\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " Apr 24 16:39:55.936266 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936229 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60b92d77-634a-4247-9238-75288e189203-installation-pull-secrets\") pod \"60b92d77-634a-4247-9238-75288e189203\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " Apr 24 16:39:55.936266 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936247 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9cwd\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-kube-api-access-t9cwd\") pod \"60b92d77-634a-4247-9238-75288e189203\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " Apr 24 16:39:55.936266 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936266 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60b92d77-634a-4247-9238-75288e189203-ca-trust-extracted\") pod \"60b92d77-634a-4247-9238-75288e189203\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " Apr 24 
16:39:55.936398 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936283 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/60b92d77-634a-4247-9238-75288e189203-image-registry-private-configuration\") pod \"60b92d77-634a-4247-9238-75288e189203\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " Apr 24 16:39:55.936398 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936308 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60b92d77-634a-4247-9238-75288e189203-registry-certificates\") pod \"60b92d77-634a-4247-9238-75288e189203\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " Apr 24 16:39:55.936398 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936361 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2" Apr 24 16:39:55.936398 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936379 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-registry-tls\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk" Apr 24 16:39:55.936398 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936397 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert\") pod \"ingress-canary-hs8hk\" (UID: \"d1d5372f-dc95-47d2-a6be-7d40b52c4c1d\") " 
pod="openshift-ingress-canary/ingress-canary-hs8hk"
Apr 24 16:39:55.936536 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936415 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-image-registry-private-configuration\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:55.936536 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936437 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-installation-pull-secrets\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:55.936536 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936452 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrzth\" (UniqueName: \"kubernetes.io/projected/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-kube-api-access-vrzth\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:55.936536 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936474 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-bound-sa-token\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:55.936536 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936506 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-registry-certificates\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:55.936536 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936532 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-trusted-ca\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:55.936702 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936565 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-ca-trust-extracted\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:55.936910 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.936894 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-ca-trust-extracted\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:55.938231 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.937883 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60b92d77-634a-4247-9238-75288e189203-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "60b92d77-634a-4247-9238-75288e189203" (UID: "60b92d77-634a-4247-9238-75288e189203"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 16:39:55.938708 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.938660 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60b92d77-634a-4247-9238-75288e189203-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "60b92d77-634a-4247-9238-75288e189203" (UID: "60b92d77-634a-4247-9238-75288e189203"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:39:55.939093 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.939062 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60b92d77-634a-4247-9238-75288e189203-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "60b92d77-634a-4247-9238-75288e189203" (UID: "60b92d77-634a-4247-9238-75288e189203"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:39:55.939220 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.939195 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-kube-api-access-t9cwd" (OuterVolumeSpecName: "kube-api-access-t9cwd") pod "60b92d77-634a-4247-9238-75288e189203" (UID: "60b92d77-634a-4247-9238-75288e189203"). InnerVolumeSpecName "kube-api-access-t9cwd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:39:55.939929 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.939906 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls\") pod \"image-registry-5578d6657b-bmlx2\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") " pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:55.940127 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.940088 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b92d77-634a-4247-9238-75288e189203-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "60b92d77-634a-4247-9238-75288e189203" (UID: "60b92d77-634a-4247-9238-75288e189203"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:39:55.940252 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.940233 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "60b92d77-634a-4247-9238-75288e189203" (UID: "60b92d77-634a-4247-9238-75288e189203"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:39:55.940524 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.940489 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1d5372f-dc95-47d2-a6be-7d40b52c4c1d-cert\") pod \"ingress-canary-hs8hk\" (UID: \"d1d5372f-dc95-47d2-a6be-7d40b52c4c1d\") " pod="openshift-ingress-canary/ingress-canary-hs8hk"
Apr 24 16:39:55.940557 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.940532 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b92d77-634a-4247-9238-75288e189203-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "60b92d77-634a-4247-9238-75288e189203" (UID: "60b92d77-634a-4247-9238-75288e189203"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:39:55.950349 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.950270 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-registry-certificates\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:55.950468 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.950443 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-trusted-ca\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:55.951756 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.951735 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-registry-tls\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:55.951939 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.951920 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-bound-sa-token\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:55.952409 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.952390 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrzth\" (UniqueName: \"kubernetes.io/projected/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-kube-api-access-vrzth\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:55.957636 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.957598 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-image-registry-private-configuration\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:55.957805 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.957787 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d56aa02a-1a5d-4ac9-b72b-f121ecc374b3-installation-pull-secrets\") pod \"image-registry-6c787d785f-2vkbk\" (UID: \"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3\") " pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:55.975295 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.975277 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l2q9c"
Apr 24 16:39:55.995911 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:55.995884 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-pq8gb"]
Apr 24 16:39:55.998784 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:39:55.998755 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9be54bef_8146_4560_aef7_387e885e8227.slice/crio-9d3c82734b5ddc88bcba645bc622d7a16a4e2f7d2795be5043cfb9614353c928 WatchSource:0}: Error finding container 9d3c82734b5ddc88bcba645bc622d7a16a4e2f7d2795be5043cfb9614353c928: Status 404 returned error can't find the container with id 9d3c82734b5ddc88bcba645bc622d7a16a4e2f7d2795be5043cfb9614353c928
Apr 24 16:39:56.037544 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.037514 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls\") pod \"60b92d77-634a-4247-9238-75288e189203\" (UID: \"60b92d77-634a-4247-9238-75288e189203\") "
Apr 24 16:39:56.037686 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.037637 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:56.037748 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.037696 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t9cwd\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-kube-api-access-t9cwd\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:39:56.037748 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.037706 2561 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/60b92d77-634a-4247-9238-75288e189203-ca-trust-extracted\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:39:56.037748 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.037716 2561 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/60b92d77-634a-4247-9238-75288e189203-image-registry-private-configuration\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:39:56.037748 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.037731 2561 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/60b92d77-634a-4247-9238-75288e189203-registry-certificates\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:39:56.037748 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.037740 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60b92d77-634a-4247-9238-75288e189203-trusted-ca\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:39:56.037748 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.037748 2561 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-bound-sa-token\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:39:56.038076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.037756 2561 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/60b92d77-634a-4247-9238-75288e189203-installation-pull-secrets\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:39:56.039351 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.039326 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "60b92d77-634a-4247-9238-75288e189203" (UID: "60b92d77-634a-4247-9238-75288e189203"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:39:56.039710 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.039691 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b-metrics-tls\") pod \"dns-default-ncrkw\" (UID: \"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b\") " pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:56.066564 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.066541 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ff9nx\""
Apr 24 16:39:56.073737 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.073718 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:56.088856 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.088806 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l2q9c"]
Apr 24 16:39:56.091696 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:39:56.091673 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadcfe200_f21b_4233_a47a_d3e925e32b96.slice/crio-cd61615a36a3616d9dcfd464b97e758a891e315339e95cadd97f2cd0b77237b5 WatchSource:0}: Error finding container cd61615a36a3616d9dcfd464b97e758a891e315339e95cadd97f2cd0b77237b5: Status 404 returned error can't find the container with id cd61615a36a3616d9dcfd464b97e758a891e315339e95cadd97f2cd0b77237b5
Apr 24 16:39:56.139128 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.139091 2561 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/60b92d77-634a-4247-9238-75288e189203-registry-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:39:56.196063 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.196039 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c787d785f-2vkbk"]
Apr 24 16:39:56.198990 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:39:56.198960 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd56aa02a_1a5d_4ac9_b72b_f121ecc374b3.slice/crio-14b2a46ea5765f7100232c78d687eeb9d8b10462cb3693ed4c352f3348f94c4d WatchSource:0}: Error finding container 14b2a46ea5765f7100232c78d687eeb9d8b10462cb3693ed4c352f3348f94c4d: Status 404 returned error can't find the container with id 14b2a46ea5765f7100232c78d687eeb9d8b10462cb3693ed4c352f3348f94c4d
Apr 24 16:39:56.237715 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.237693 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zcktm\""
Apr 24 16:39:56.245723 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.245703 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hs8hk"
Apr 24 16:39:56.289318 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.288670 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-x6pvx\""
Apr 24 16:39:56.296241 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.296214 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:39:56.369002 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.368980 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hs8hk"]
Apr 24 16:39:56.371249 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:39:56.371206 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1d5372f_dc95_47d2_a6be_7d40b52c4c1d.slice/crio-10f718a06dde994fdc15f8beec98d53e1d4f23a8a9cb3f2c8d01aba9dfbebb92 WatchSource:0}: Error finding container 10f718a06dde994fdc15f8beec98d53e1d4f23a8a9cb3f2c8d01aba9dfbebb92: Status 404 returned error can't find the container with id 10f718a06dde994fdc15f8beec98d53e1d4f23a8a9cb3f2c8d01aba9dfbebb92
Apr 24 16:39:56.420517 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:39:56.420485 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8f68bdd_e4ea_40ca_a7ac_12f3eeeebd4b.slice/crio-ace2f908205808ff0ac92f44a823019c4878116c87e468d412aedf600d570a24 WatchSource:0}: Error finding container ace2f908205808ff0ac92f44a823019c4878116c87e468d412aedf600d570a24: Status 404 returned error can't find the container with id ace2f908205808ff0ac92f44a823019c4878116c87e468d412aedf600d570a24
Apr 24 16:39:56.421602 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.421580 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ncrkw"]
Apr 24 16:39:56.910009 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.909956 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l2q9c" event={"ID":"adcfe200-f21b-4233-a47a-d3e925e32b96","Type":"ContainerStarted","Data":"3d8e990dd4d533d8c37bc0d512d1379a01961f41f2594dc6ecf65e6b3c3a7ba5"}
Apr 24 16:39:56.910375 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.910016 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l2q9c" event={"ID":"adcfe200-f21b-4233-a47a-d3e925e32b96","Type":"ContainerStarted","Data":"cd61615a36a3616d9dcfd464b97e758a891e315339e95cadd97f2cd0b77237b5"}
Apr 24 16:39:56.910957 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.910936 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ncrkw" event={"ID":"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b","Type":"ContainerStarted","Data":"ace2f908205808ff0ac92f44a823019c4878116c87e468d412aedf600d570a24"}
Apr 24 16:39:56.911796 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.911773 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-pq8gb" event={"ID":"9be54bef-8146-4560-aef7-387e885e8227","Type":"ContainerStarted","Data":"9d3c82734b5ddc88bcba645bc622d7a16a4e2f7d2795be5043cfb9614353c928"}
Apr 24 16:39:56.912657 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.912636 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hs8hk" event={"ID":"d1d5372f-dc95-47d2-a6be-7d40b52c4c1d","Type":"ContainerStarted","Data":"10f718a06dde994fdc15f8beec98d53e1d4f23a8a9cb3f2c8d01aba9dfbebb92"}
Apr 24 16:39:56.914163 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.914145 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5578d6657b-bmlx2"
Apr 24 16:39:56.914270 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.914146 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c787d785f-2vkbk" event={"ID":"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3","Type":"ContainerStarted","Data":"6121c7dfebedf4849d485038138a07cef117f0f421957a99b0c4f47699452741"}
Apr 24 16:39:56.914345 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.914282 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c787d785f-2vkbk" event={"ID":"d56aa02a-1a5d-4ac9-b72b-f121ecc374b3","Type":"ContainerStarted","Data":"14b2a46ea5765f7100232c78d687eeb9d8b10462cb3693ed4c352f3348f94c4d"}
Apr 24 16:39:56.944318 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.944084 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6c787d785f-2vkbk" podStartSLOduration=1.944045804 podStartE2EDuration="1.944045804s" podCreationTimestamp="2026-04-24 16:39:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:56.943603693 +0000 UTC m=+65.796474408" watchObservedRunningTime="2026-04-24 16:39:56.944045804 +0000 UTC m=+65.796916518"
Apr 24 16:39:56.977420 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.977397 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5578d6657b-bmlx2"]
Apr 24 16:39:56.983476 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:56.983451 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5578d6657b-bmlx2"]
Apr 24 16:39:57.351542 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.351443 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs\") pod \"network-metrics-daemon-6c75x\" (UID: \"7abff685-57e5-4da6-b220-dbc6c56835ed\") " pod="openshift-multus/network-metrics-daemon-6c75x"
Apr 24 16:39:57.354043 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.354017 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 16:39:57.366171 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.366101 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7abff685-57e5-4da6-b220-dbc6c56835ed-metrics-certs\") pod \"network-metrics-daemon-6c75x\" (UID: \"7abff685-57e5-4da6-b220-dbc6c56835ed\") " pod="openshift-multus/network-metrics-daemon-6c75x"
Apr 24 16:39:57.452487 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.452457 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhj7k\" (UniqueName: \"kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k\") pod \"network-check-target-2p9kk\" (UID: \"f32abd88-b479-49d4-b013-a952386d695c\") " pod="openshift-network-diagnostics/network-check-target-2p9kk"
Apr 24 16:39:57.454958 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.454915 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 16:39:57.465132 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.465100 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-j4znz\""
Apr 24 16:39:57.465523 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.465412 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 16:39:57.472565 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.472543 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6c75x"
Apr 24 16:39:57.477327 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.477280 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhj7k\" (UniqueName: \"kubernetes.io/projected/f32abd88-b479-49d4-b013-a952386d695c-kube-api-access-zhj7k\") pod \"network-check-target-2p9kk\" (UID: \"f32abd88-b479-49d4-b013-a952386d695c\") " pod="openshift-network-diagnostics/network-check-target-2p9kk"
Apr 24 16:39:57.627745 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.627707 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6c75x"]
Apr 24 16:39:57.632711 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:39:57.632681 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7abff685_57e5_4da6_b220_dbc6c56835ed.slice/crio-3e04ee44a6a27585a0a6bb719e613f137dec314afc8bb00aa2dd5c0829d1a9ba WatchSource:0}: Error finding container 3e04ee44a6a27585a0a6bb719e613f137dec314afc8bb00aa2dd5c0829d1a9ba: Status 404 returned error can't find the container with id 3e04ee44a6a27585a0a6bb719e613f137dec314afc8bb00aa2dd5c0829d1a9ba
Apr 24 16:39:57.651793 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.651601 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60b92d77-634a-4247-9238-75288e189203" path="/var/lib/kubelet/pods/60b92d77-634a-4247-9238-75288e189203/volumes"
Apr 24 16:39:57.760310 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.760085 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fcvt5\""
Apr 24 16:39:57.768701 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.768454 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2p9kk"
Apr 24 16:39:57.922702 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.922628 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2p9kk"]
Apr 24 16:39:57.923599 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.923540 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l2q9c" event={"ID":"adcfe200-f21b-4233-a47a-d3e925e32b96","Type":"ContainerStarted","Data":"92996e7b13536b16875e7255ad582a1f11125a00d9916d4246a6938a502fa625"}
Apr 24 16:39:57.925438 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.925375 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6c75x" event={"ID":"7abff685-57e5-4da6-b220-dbc6c56835ed","Type":"ContainerStarted","Data":"3e04ee44a6a27585a0a6bb719e613f137dec314afc8bb00aa2dd5c0829d1a9ba"}
Apr 24 16:39:57.925438 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:57.925402 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6c787d785f-2vkbk"
Apr 24 16:39:58.482932 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:39:58.482902 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf32abd88_b479_49d4_b013_a952386d695c.slice/crio-c351672f543489d19535b6db3a0485f6943f5b94005791769917419235e7920f WatchSource:0}: Error finding container c351672f543489d19535b6db3a0485f6943f5b94005791769917419235e7920f: Status 404 returned error can't find the container with id c351672f543489d19535b6db3a0485f6943f5b94005791769917419235e7920f
Apr 24 16:39:58.929253 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:58.929140 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2p9kk" event={"ID":"f32abd88-b479-49d4-b013-a952386d695c","Type":"ContainerStarted","Data":"c351672f543489d19535b6db3a0485f6943f5b94005791769917419235e7920f"}
Apr 24 16:39:59.936204 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:39:59.935462 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hs8hk" event={"ID":"d1d5372f-dc95-47d2-a6be-7d40b52c4c1d","Type":"ContainerStarted","Data":"c5290313a3586c6e42dafba4915fd008e65b2b450a54e08f5ffb0e748083c34a"}
Apr 24 16:40:00.941284 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:00.941249 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ncrkw" event={"ID":"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b","Type":"ContainerStarted","Data":"51bbfc854329269ef341ca76ca64dd512f289221bf40351e0ed6c6eae86765d9"}
Apr 24 16:40:00.941692 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:00.941293 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ncrkw" event={"ID":"d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b","Type":"ContainerStarted","Data":"0bd30332fe395477a31d2803c44f5ed7f43ae838f23b2f808ecbbb9ccf1540fe"}
Apr 24 16:40:00.941692 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:00.941438 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:40:00.943629 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:00.943581 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l2q9c" event={"ID":"adcfe200-f21b-4233-a47a-d3e925e32b96","Type":"ContainerStarted","Data":"d3744210849405a998b2a529a3ef479bc6e54132fa72be7442a2e7998b199986"}
Apr 24 16:40:00.945558 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:00.945532 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6c75x" event={"ID":"7abff685-57e5-4da6-b220-dbc6c56835ed","Type":"ContainerStarted","Data":"c3bb0e2b3fc2baaae740f676f3913678ac747b5439bf5e7a5bdc0ef0c6d332f2"}
Apr 24 16:40:00.945677 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:00.945573 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6c75x" event={"ID":"7abff685-57e5-4da6-b220-dbc6c56835ed","Type":"ContainerStarted","Data":"e70bb7bf57333d3a98701fcbadc6142b12d52c10ee6d5e285a2efde05dbdcee0"}
Apr 24 16:40:00.961029 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:00.960512 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ncrkw" podStartSLOduration=33.577528302 podStartE2EDuration="36.96050114s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:39:56.422309742 +0000 UTC m=+65.275180434" lastFinishedPulling="2026-04-24 16:39:59.805282575 +0000 UTC m=+68.658153272" observedRunningTime="2026-04-24 16:40:00.959853925 +0000 UTC m=+69.812724662" watchObservedRunningTime="2026-04-24 16:40:00.96050114 +0000 UTC m=+69.813371853"
Apr 24 16:40:00.961029 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:00.960598 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hs8hk" podStartSLOduration=33.52952782 podStartE2EDuration="36.960594083s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:39:56.373774153 +0000 UTC m=+65.226644845" lastFinishedPulling="2026-04-24 16:39:59.804840413 +0000 UTC m=+68.657711108" observedRunningTime="2026-04-24 16:39:59.952931636 +0000 UTC m=+68.805802350" watchObservedRunningTime="2026-04-24 16:40:00.960594083 +0000 UTC m=+69.813464774"
Apr 24 16:40:00.995873 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:00.995773 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6c75x" podStartSLOduration=67.76948036900001 podStartE2EDuration="1m9.995758754s" podCreationTimestamp="2026-04-24 16:38:51 +0000 UTC" firstStartedPulling="2026-04-24 16:39:57.635766087 +0000 UTC m=+66.488636784" lastFinishedPulling="2026-04-24 16:39:59.862044464 +0000 UTC m=+68.714915169" observedRunningTime="2026-04-24 16:40:00.994802452 +0000 UTC m=+69.847673168" watchObservedRunningTime="2026-04-24 16:40:00.995758754 +0000 UTC m=+69.848629465"
Apr 24 16:40:00.996599 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:00.996361 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-l2q9c" podStartSLOduration=2.401358363 podStartE2EDuration="5.996348201s" podCreationTimestamp="2026-04-24 16:39:55 +0000 UTC" firstStartedPulling="2026-04-24 16:39:56.209852228 +0000 UTC m=+65.062722919" lastFinishedPulling="2026-04-24 16:39:59.804842055 +0000 UTC m=+68.657712757" observedRunningTime="2026-04-24 16:40:00.977600619 +0000 UTC m=+69.830471333" watchObservedRunningTime="2026-04-24 16:40:00.996348201 +0000 UTC m=+69.849218914"
Apr 24 16:40:02.954155 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:02.954102 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2p9kk" event={"ID":"f32abd88-b479-49d4-b013-a952386d695c","Type":"ContainerStarted","Data":"4d5bfdbbfeffaa6b02bf47d6c83685ce038617ec69983d840ad4e3758161cad4"}
Apr 24 16:40:02.954607 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:02.954306 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2p9kk"
Apr 24 16:40:02.971453 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:02.971395 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2p9kk" podStartSLOduration=68.10995043 podStartE2EDuration="1m11.971378218s" podCreationTimestamp="2026-04-24 16:38:51 +0000 UTC" firstStartedPulling="2026-04-24 16:39:58.484985862 +0000 UTC m=+67.337856554" lastFinishedPulling="2026-04-24 16:40:02.346413638 +0000 UTC m=+71.199284342" observedRunningTime="2026-04-24 16:40:02.969493202 +0000 UTC m=+71.822363916" watchObservedRunningTime="2026-04-24 16:40:02.971378218 +0000 UTC m=+71.824248931"
Apr 24 16:40:03.002561 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:03.002524 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mr4bh\" (UID: \"e7b94c4f-1b89-4a52-ba90-78436120bab7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh"
Apr 24 16:40:03.005245 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:03.005222 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e7b94c4f-1b89-4a52-ba90-78436120bab7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-mr4bh\" (UID: \"e7b94c4f-1b89-4a52-ba90-78436120bab7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh"
Apr 24 16:40:03.259430 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:03.259348 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-gx867\""
Apr 24 16:40:03.267417 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:03.267391 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh"
Apr 24 16:40:03.430347 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:03.430293 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh"]
Apr 24 16:40:03.434079 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:40:03.434050 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7b94c4f_1b89_4a52_ba90_78436120bab7.slice/crio-2a0d5541d9f7a81b45af9a1add5c3a626c7a73f45307908431d5312cd762c913 WatchSource:0}: Error finding container 2a0d5541d9f7a81b45af9a1add5c3a626c7a73f45307908431d5312cd762c913: Status 404 returned error can't find the container with id 2a0d5541d9f7a81b45af9a1add5c3a626c7a73f45307908431d5312cd762c913
Apr 24 16:40:03.957725 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:03.957685 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh" event={"ID":"e7b94c4f-1b89-4a52-ba90-78436120bab7","Type":"ContainerStarted","Data":"2a0d5541d9f7a81b45af9a1add5c3a626c7a73f45307908431d5312cd762c913"}
Apr 24 16:40:04.962061 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:04.962026 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh" event={"ID":"e7b94c4f-1b89-4a52-ba90-78436120bab7","Type":"ContainerStarted","Data":"4ba0cdec136ad1de4d794e541d855fa5d3c41e5efe102d6ed8bf79829a8538c0"}
Apr 24 16:40:04.979493 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:04.979448 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-mr4bh" podStartSLOduration=32.822039481 podStartE2EDuration="33.979433389s" podCreationTimestamp="2026-04-24 16:39:31 +0000 UTC" firstStartedPulling="2026-04-24 16:40:03.436622908 +0000 UTC m=+72.289493599" lastFinishedPulling="2026-04-24 16:40:04.594016802 +0000 UTC m=+73.446887507" observedRunningTime="2026-04-24 16:40:04.9778653 +0000 UTC m=+73.830736011" watchObservedRunningTime="2026-04-24 16:40:04.979433389 +0000 UTC m=+73.832304100"
Apr 24 16:40:10.951949 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:10.951822 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ncrkw"
Apr 24 16:40:13.987158 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:13.987102 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-pq8gb" event={"ID":"9be54bef-8146-4560-aef7-387e885e8227","Type":"ContainerStarted","Data":"692e82648b854256b116e67d66db5636d891b8f616e73e39323b16e5b701d8fc"}
Apr 24 16:40:13.987601 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:13.987344 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-pq8gb"
Apr 24 16:40:13.999069 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:13.999044 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-pq8gb"
Apr 24 16:40:14.009755 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.009608 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-pq8gb" podStartSLOduration=2.016089789 podStartE2EDuration="19.009592382s" podCreationTimestamp="2026-04-24 16:39:55 +0000 UTC" firstStartedPulling="2026-04-24 16:39:56.00065481 +0000 UTC m=+64.853525521" lastFinishedPulling="2026-04-24 16:40:12.994157408 +0000 UTC m=+81.847028114" observedRunningTime="2026-04-24 16:40:14.007414039 +0000 UTC m=+82.860284751" watchObservedRunningTime="2026-04-24 16:40:14.009592382 +0000 UTC m=+82.862463096"
Apr 24 16:40:14.237798 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.237724 2561 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["openshift-monitoring/node-exporter-4r2vk"] Apr 24 16:40:14.243817 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.243794 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.247269 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.247228 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 16:40:14.247487 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.247464 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 16:40:14.247561 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.247488 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-n9mn8\"" Apr 24 16:40:14.248517 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.247978 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 16:40:14.250272 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.250023 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 16:40:14.250272 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.250061 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 16:40:14.252403 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.252326 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 16:40:14.392930 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.392890 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-metrics-client-ca\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.393095 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.392942 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl9z9\" (UniqueName: \"kubernetes.io/projected/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-kube-api-access-sl9z9\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.393095 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.392969 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-root\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.393095 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.393014 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.393095 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.393045 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-wtmp\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.393095 ip-10-0-128-44 kubenswrapper[2561]: I0424 
16:40:14.393078 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-tls\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.393391 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.393133 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-textfile\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.393391 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.393159 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-accelerators-collector-config\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.393391 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.393193 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-sys\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.496188 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.493974 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-textfile\") pod \"node-exporter-4r2vk\" (UID: 
\"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.496188 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.494044 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-accelerators-collector-config\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.496188 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.494087 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-sys\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.496188 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.494148 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-metrics-client-ca\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.496188 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.494183 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sl9z9\" (UniqueName: \"kubernetes.io/projected/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-kube-api-access-sl9z9\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.496188 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.494207 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-root\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.496188 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.494236 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.496188 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.494265 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-wtmp\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.496188 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.494291 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-tls\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.496188 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:40:14.494416 2561 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 16:40:14.496188 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:40:14.494484 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-tls podName:0537ffed-f4ae-49a0-a7ef-a27e012a41ac nodeName:}" failed. 
No retries permitted until 2026-04-24 16:40:14.994462117 +0000 UTC m=+83.847332812 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-tls") pod "node-exporter-4r2vk" (UID: "0537ffed-f4ae-49a0-a7ef-a27e012a41ac") : secret "node-exporter-tls" not found Apr 24 16:40:14.496188 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.495020 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-textfile\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.496188 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.495549 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-accelerators-collector-config\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.496188 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.495616 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-sys\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.496188 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.496023 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-metrics-client-ca\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " 
pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.497623 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.497254 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-root\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.497754 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.497726 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-wtmp\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.500772 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.500722 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.506097 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.506051 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl9z9\" (UniqueName: \"kubernetes.io/projected/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-kube-api-access-sl9z9\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.998101 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:14.998063 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-tls\") pod \"node-exporter-4r2vk\" (UID: 
\"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:14.998660 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:40:14.998243 2561 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 16:40:14.998660 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:40:14.998363 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-tls podName:0537ffed-f4ae-49a0-a7ef-a27e012a41ac nodeName:}" failed. No retries permitted until 2026-04-24 16:40:15.998339706 +0000 UTC m=+84.851210399 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-tls") pod "node-exporter-4r2vk" (UID: "0537ffed-f4ae-49a0-a7ef-a27e012a41ac") : secret "node-exporter-tls" not found Apr 24 16:40:16.006465 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:16.006434 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-tls\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:16.008968 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:16.008943 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0537ffed-f4ae-49a0-a7ef-a27e012a41ac-node-exporter-tls\") pod \"node-exporter-4r2vk\" (UID: \"0537ffed-f4ae-49a0-a7ef-a27e012a41ac\") " pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:16.061245 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:16.061213 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4r2vk" Apr 24 16:40:16.078154 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:16.078088 2561 patch_prober.go:28] interesting pod/image-registry-6c787d785f-2vkbk container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 16:40:16.078289 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:16.078177 2561 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6c787d785f-2vkbk" podUID="d56aa02a-1a5d-4ac9-b72b-f121ecc374b3" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:40:16.081584 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:40:16.081556 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0537ffed_f4ae_49a0_a7ef_a27e012a41ac.slice/crio-4554ea64c9d2d7b7f4626ed033d2c5da97bb09c97da982d222f2e1e25cb544cb WatchSource:0}: Error finding container 4554ea64c9d2d7b7f4626ed033d2c5da97bb09c97da982d222f2e1e25cb544cb: Status 404 returned error can't find the container with id 4554ea64c9d2d7b7f4626ed033d2c5da97bb09c97da982d222f2e1e25cb544cb Apr 24 16:40:16.998053 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:16.998019 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4r2vk" event={"ID":"0537ffed-f4ae-49a0-a7ef-a27e012a41ac","Type":"ContainerStarted","Data":"4554ea64c9d2d7b7f4626ed033d2c5da97bb09c97da982d222f2e1e25cb544cb"} Apr 24 16:40:18.002815 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:18.002776 2561 generic.go:358] "Generic (PLEG): container finished" podID="0537ffed-f4ae-49a0-a7ef-a27e012a41ac" containerID="6142dd5d1c051c2586cbf14d47120003fa569ccdb040f1acbdcdb163ee8f9d93" 
exitCode=0 Apr 24 16:40:18.003272 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:18.002873 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4r2vk" event={"ID":"0537ffed-f4ae-49a0-a7ef-a27e012a41ac","Type":"ContainerDied","Data":"6142dd5d1c051c2586cbf14d47120003fa569ccdb040f1acbdcdb163ee8f9d93"} Apr 24 16:40:18.936130 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:18.935849 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6c787d785f-2vkbk" Apr 24 16:40:19.009070 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.009027 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4r2vk" event={"ID":"0537ffed-f4ae-49a0-a7ef-a27e012a41ac","Type":"ContainerStarted","Data":"44129b18982619363c82355a54798e6bf7b460b286530b6ea38889535d6664bc"} Apr 24 16:40:19.009070 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.009077 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4r2vk" event={"ID":"0537ffed-f4ae-49a0-a7ef-a27e012a41ac","Type":"ContainerStarted","Data":"e97887089ea3f4f113191a53bd103ffb88a4d9a31bda1820ed3465052e800be9"} Apr 24 16:40:19.053518 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.053467 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4r2vk" podStartSLOduration=3.749721793 podStartE2EDuration="5.053450666s" podCreationTimestamp="2026-04-24 16:40:14 +0000 UTC" firstStartedPulling="2026-04-24 16:40:16.083696447 +0000 UTC m=+84.936567140" lastFinishedPulling="2026-04-24 16:40:17.387425316 +0000 UTC m=+86.240296013" observedRunningTime="2026-04-24 16:40:19.051138087 +0000 UTC m=+87.904008802" watchObservedRunningTime="2026-04-24 16:40:19.053450666 +0000 UTC m=+87.906321381" Apr 24 16:40:19.102795 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.102755 2561 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-94xpm"] Apr 24 16:40:19.127582 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.127551 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-94xpm"] Apr 24 16:40:19.127739 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.127683 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-94xpm" Apr 24 16:40:19.130403 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.130378 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 24 16:40:19.130523 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.130384 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-6z8cc\"" Apr 24 16:40:19.133181 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.133157 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67d788878c-jpsh4"] Apr 24 16:40:19.138274 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.138253 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67d788878c-jpsh4" Apr 24 16:40:19.141832 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.141808 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 16:40:19.142094 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.142047 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 16:40:19.142094 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.142067 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lz9x2\"" Apr 24 16:40:19.142283 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.142151 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 16:40:19.142378 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.142357 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 16:40:19.142438 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.142400 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 16:40:19.150635 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.150462 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 16:40:19.151383 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.151354 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67d788878c-jpsh4"] Apr 24 16:40:19.233291 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.233210 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/42b1c187-73c8-482d-a938-bf716047fdf2-console-serving-cert\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4" Apr 24 16:40:19.233291 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.233255 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l7hp\" (UniqueName: \"kubernetes.io/projected/42b1c187-73c8-482d-a938-bf716047fdf2-kube-api-access-7l7hp\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4" Apr 24 16:40:19.233499 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.233297 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-service-ca\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4" Apr 24 16:40:19.233499 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.233363 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-oauth-serving-cert\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4" Apr 24 16:40:19.233499 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.233402 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-console-config\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4" Apr 24 16:40:19.233499 ip-10-0-128-44 kubenswrapper[2561]: I0424 
16:40:19.233438 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-trusted-ca-bundle\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.233499 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.233467 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cde6ad00-395f-4d91-befe-1d950c0d7182-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-94xpm\" (UID: \"cde6ad00-395f-4d91-befe-1d950c0d7182\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-94xpm"
Apr 24 16:40:19.233684 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.233538 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42b1c187-73c8-482d-a938-bf716047fdf2-console-oauth-config\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.334193 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.334164 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-service-ca\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.334365 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.334204 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-oauth-serving-cert\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.334365 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.334232 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-console-config\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.334365 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.334259 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-trusted-ca-bundle\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.334365 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.334289 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cde6ad00-395f-4d91-befe-1d950c0d7182-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-94xpm\" (UID: \"cde6ad00-395f-4d91-befe-1d950c0d7182\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-94xpm"
Apr 24 16:40:19.334540 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:40:19.334394 2561 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 24 16:40:19.334540 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.334418 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42b1c187-73c8-482d-a938-bf716047fdf2-console-oauth-config\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.334540 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:40:19.334460 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cde6ad00-395f-4d91-befe-1d950c0d7182-monitoring-plugin-cert podName:cde6ad00-395f-4d91-befe-1d950c0d7182 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:19.834441832 +0000 UTC m=+88.687312524 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/cde6ad00-395f-4d91-befe-1d950c0d7182-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-94xpm" (UID: "cde6ad00-395f-4d91-befe-1d950c0d7182") : secret "monitoring-plugin-cert" not found
Apr 24 16:40:19.334540 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.334488 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42b1c187-73c8-482d-a938-bf716047fdf2-console-serving-cert\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.334540 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.334511 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7l7hp\" (UniqueName: \"kubernetes.io/projected/42b1c187-73c8-482d-a938-bf716047fdf2-kube-api-access-7l7hp\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.346964 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.346938 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-oauth-serving-cert\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.347149 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.347103 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-service-ca\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.347250 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.347033 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-console-config\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.347309 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.347246 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-trusted-ca-bundle\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.349069 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.349043 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42b1c187-73c8-482d-a938-bf716047fdf2-console-serving-cert\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.349225 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.349208 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42b1c187-73c8-482d-a938-bf716047fdf2-console-oauth-config\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.349458 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.349429 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l7hp\" (UniqueName: \"kubernetes.io/projected/42b1c187-73c8-482d-a938-bf716047fdf2-kube-api-access-7l7hp\") pod \"console-67d788878c-jpsh4\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.448664 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.448628 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:19.582142 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.582057 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67d788878c-jpsh4"]
Apr 24 16:40:19.585314 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:40:19.585281 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42b1c187_73c8_482d_a938_bf716047fdf2.slice/crio-c76d1d5d8629eca13f81c03897233ded77ccbacf236e61dfea8235ecd7684805 WatchSource:0}: Error finding container c76d1d5d8629eca13f81c03897233ded77ccbacf236e61dfea8235ecd7684805: Status 404 returned error can't find the container with id c76d1d5d8629eca13f81c03897233ded77ccbacf236e61dfea8235ecd7684805
Apr 24 16:40:19.838010 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.837893 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cde6ad00-395f-4d91-befe-1d950c0d7182-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-94xpm\" (UID: \"cde6ad00-395f-4d91-befe-1d950c0d7182\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-94xpm"
Apr 24 16:40:19.840575 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:19.840550 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/cde6ad00-395f-4d91-befe-1d950c0d7182-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-94xpm\" (UID: \"cde6ad00-395f-4d91-befe-1d950c0d7182\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-94xpm"
Apr 24 16:40:20.013750 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:20.013707 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67d788878c-jpsh4" event={"ID":"42b1c187-73c8-482d-a938-bf716047fdf2","Type":"ContainerStarted","Data":"c76d1d5d8629eca13f81c03897233ded77ccbacf236e61dfea8235ecd7684805"}
Apr 24 16:40:20.038453 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:20.038421 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-94xpm"
Apr 24 16:40:20.168315 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:20.168286 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-94xpm"]
Apr 24 16:40:20.171431 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:40:20.171401 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcde6ad00_395f_4d91_befe_1d950c0d7182.slice/crio-51773a7c2390f32f8ef566021e594931144acf641cb62fba74886d8f8f8434c3 WatchSource:0}: Error finding container 51773a7c2390f32f8ef566021e594931144acf641cb62fba74886d8f8f8434c3: Status 404 returned error can't find the container with id 51773a7c2390f32f8ef566021e594931144acf641cb62fba74886d8f8f8434c3
Apr 24 16:40:21.021044 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:21.021001 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-94xpm" event={"ID":"cde6ad00-395f-4d91-befe-1d950c0d7182","Type":"ContainerStarted","Data":"51773a7c2390f32f8ef566021e594931144acf641cb62fba74886d8f8f8434c3"}
Apr 24 16:40:24.037665 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:24.037617 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-94xpm" event={"ID":"cde6ad00-395f-4d91-befe-1d950c0d7182","Type":"ContainerStarted","Data":"fb69dca6648e5d13380d81ab802dc3ea07343983e19ed8ea83df3e6330e2ae32"}
Apr 24 16:40:24.038175 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:24.037770 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-94xpm"
Apr 24 16:40:24.039086 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:24.039062 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67d788878c-jpsh4" event={"ID":"42b1c187-73c8-482d-a938-bf716047fdf2","Type":"ContainerStarted","Data":"dd9bdae1d0c5928e2b92e92de4ac25d35bff5098df6e7350d2770317e846cf2a"}
Apr 24 16:40:24.042805 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:24.042777 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-94xpm"
Apr 24 16:40:24.053105 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:24.053059 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-94xpm" podStartSLOduration=1.738415951 podStartE2EDuration="5.053043811s" podCreationTimestamp="2026-04-24 16:40:19 +0000 UTC" firstStartedPulling="2026-04-24 16:40:20.173520241 +0000 UTC m=+89.026390932" lastFinishedPulling="2026-04-24 16:40:23.488148093 +0000 UTC m=+92.341018792" observedRunningTime="2026-04-24 16:40:24.051895247 +0000 UTC m=+92.904765959" watchObservedRunningTime="2026-04-24 16:40:24.053043811 +0000 UTC m=+92.905914525"
Apr 24 16:40:24.067753 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:24.067697 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67d788878c-jpsh4" podStartSLOduration=1.158623337 podStartE2EDuration="5.067687286s" podCreationTimestamp="2026-04-24 16:40:19 +0000 UTC" firstStartedPulling="2026-04-24 16:40:19.587671175 +0000 UTC m=+88.440541867" lastFinishedPulling="2026-04-24 16:40:23.496735123 +0000 UTC m=+92.349605816" observedRunningTime="2026-04-24 16:40:24.067658073 +0000 UTC m=+92.920528787" watchObservedRunningTime="2026-04-24 16:40:24.067687286 +0000 UTC m=+92.920557997"
Apr 24 16:40:29.449287 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:29.449257 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:29.449766 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:29.449518 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:29.453575 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:29.453548 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:30.057919 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:30.057894 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67d788878c-jpsh4"
Apr 24 16:40:33.960873 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:40:33.960843 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2p9kk"
Apr 24 16:41:11.055587 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.055546 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6bcfb44746-wdp94"]
Apr 24 16:41:11.058588 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.058564 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.070263 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.070235 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bcfb44746-wdp94"]
Apr 24 16:41:11.097371 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.097337 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-oauth-config\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.097524 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.097384 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-config\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.097524 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.097416 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-service-ca\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.097524 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.097450 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-trusted-ca-bundle\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.097524 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.097476 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-oauth-serving-cert\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.097524 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.097510 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pklb\" (UniqueName: \"kubernetes.io/projected/fe09c3ea-c187-4fe4-a682-7d969af71ba6-kube-api-access-9pklb\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.097713 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.097544 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-serving-cert\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.198300 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.198270 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-oauth-serving-cert\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.198300 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.198305 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pklb\" (UniqueName: \"kubernetes.io/projected/fe09c3ea-c187-4fe4-a682-7d969af71ba6-kube-api-access-9pklb\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.198527 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.198331 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-serving-cert\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.198527 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.198359 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-oauth-config\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.198527 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.198383 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-config\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.198527 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.198405 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-service-ca\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.198527 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.198436 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-trusted-ca-bundle\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.199141 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.199093 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-service-ca\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.199263 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.199139 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-oauth-serving-cert\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.199263 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.199247 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-config\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.199433 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.199416 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-trusted-ca-bundle\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.200972 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.200949 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-serving-cert\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.201066 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.200981 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-oauth-config\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.208768 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.208744 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pklb\" (UniqueName: \"kubernetes.io/projected/fe09c3ea-c187-4fe4-a682-7d969af71ba6-kube-api-access-9pklb\") pod \"console-6bcfb44746-wdp94\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.366937 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.366858 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:11.482430 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:11.482407 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bcfb44746-wdp94"]
Apr 24 16:41:11.484760 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:41:11.484738 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe09c3ea_c187_4fe4_a682_7d969af71ba6.slice/crio-6b10c970396e36bc550b38e3c378c2dcd3a831b10ab2667b554e23695c2433c6 WatchSource:0}: Error finding container 6b10c970396e36bc550b38e3c378c2dcd3a831b10ab2667b554e23695c2433c6: Status 404 returned error can't find the container with id 6b10c970396e36bc550b38e3c378c2dcd3a831b10ab2667b554e23695c2433c6
Apr 24 16:41:12.162635 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:12.162598 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bcfb44746-wdp94" event={"ID":"fe09c3ea-c187-4fe4-a682-7d969af71ba6","Type":"ContainerStarted","Data":"90902d8517d9c3eb5f86aee62395e3fefb37b4451047a90afceb06b2ae8196ef"}
Apr 24 16:41:12.162635 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:12.162636 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bcfb44746-wdp94" event={"ID":"fe09c3ea-c187-4fe4-a682-7d969af71ba6","Type":"ContainerStarted","Data":"6b10c970396e36bc550b38e3c378c2dcd3a831b10ab2667b554e23695c2433c6"}
Apr 24 16:41:12.179694 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:12.179652 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6bcfb44746-wdp94" podStartSLOduration=1.1796384930000001 podStartE2EDuration="1.179638493s" podCreationTimestamp="2026-04-24 16:41:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:41:12.17819629 +0000 UTC m=+141.031067004" watchObservedRunningTime="2026-04-24 16:41:12.179638493 +0000 UTC m=+141.032509205"
Apr 24 16:41:21.367077 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:21.367045 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:21.367077 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:21.367081 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:21.371879 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:21.371858 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:22.193848 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.193819 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6bcfb44746-wdp94"
Apr 24 16:41:22.242003 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.241970 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67d788878c-jpsh4"]
Apr 24 16:41:22.313665 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.313631 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7fdc7876ff-nqcg8"]
Apr 24 16:41:22.318191 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.318174 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.327093 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.327068 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fdc7876ff-nqcg8"]
Apr 24 16:41:22.384614 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.384579 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xv5w\" (UniqueName: \"kubernetes.io/projected/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-kube-api-access-4xv5w\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.384996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.384622 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-serving-cert\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.384996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.384641 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-service-ca\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.384996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.384657 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-trusted-ca-bundle\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.384996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.384713 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-oauth-serving-cert\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.384996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.384755 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-oauth-config\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.384996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.384783 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-config\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.486025 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.485930 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-config\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.486025 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.485993 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xv5w\" (UniqueName: \"kubernetes.io/projected/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-kube-api-access-4xv5w\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.486292 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.486032 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-serving-cert\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.486292 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.486056 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-service-ca\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.486292 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.486086 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-trusted-ca-bundle\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.486292 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.486112 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-oauth-serving-cert\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.486292 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.486169 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-oauth-config\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.486676 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.486648 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-config\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.486774 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.486749 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-oauth-serving-cert\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.486838 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.486749 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-service-ca\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.487315 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.487291 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-trusted-ca-bundle\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.488523 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.488493 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-serving-cert\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.488626 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.488570 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-oauth-config\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.495291 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.495272 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xv5w\" (UniqueName: \"kubernetes.io/projected/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-kube-api-access-4xv5w\") pod \"console-7fdc7876ff-nqcg8\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") " pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.627750 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.627714 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:41:22.744570 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:22.744412 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fdc7876ff-nqcg8"]
Apr 24 16:41:22.747191 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:41:22.747162 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d0e08a6_353b_4ea7_a22c_12a3a3f80932.slice/crio-0f1f706ed80bf87cfa0091a562ea02a323ef844267e281232abeb555e4d8bb38 WatchSource:0}: Error finding container 0f1f706ed80bf87cfa0091a562ea02a323ef844267e281232abeb555e4d8bb38: Status 404 returned error can't find the container with id 0f1f706ed80bf87cfa0091a562ea02a323ef844267e281232abeb555e4d8bb38
Apr 24 16:41:23.194222 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:23.194183 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fdc7876ff-nqcg8" event={"ID":"9d0e08a6-353b-4ea7-a22c-12a3a3f80932","Type":"ContainerStarted","Data":"c6df665e90f247bebef395ddbd0b5eb9cb57507836c6c49d9cf27f50a1ee0a68"}
Apr 24 16:41:23.194222 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:23.194227 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fdc7876ff-nqcg8" event={"ID":"9d0e08a6-353b-4ea7-a22c-12a3a3f80932","Type":"ContainerStarted","Data":"0f1f706ed80bf87cfa0091a562ea02a323ef844267e281232abeb555e4d8bb38"}
Apr 24 16:41:23.212335 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:23.212267 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7fdc7876ff-nqcg8" podStartSLOduration=1.212247551 podStartE2EDuration="1.212247551s" podCreationTimestamp="2026-04-24 16:41:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:41:23.211101822 +0000 UTC m=+152.063972535"
watchObservedRunningTime="2026-04-24 16:41:23.212247551 +0000 UTC m=+152.065118264" Apr 24 16:41:32.628258 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:32.628213 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7fdc7876ff-nqcg8" Apr 24 16:41:32.628258 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:32.628259 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7fdc7876ff-nqcg8" Apr 24 16:41:32.632959 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:32.632940 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7fdc7876ff-nqcg8" Apr 24 16:41:33.223358 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:33.223333 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7fdc7876ff-nqcg8" Apr 24 16:41:33.268169 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:33.268140 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bcfb44746-wdp94"] Apr 24 16:41:47.260814 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.260705 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-67d788878c-jpsh4" podUID="42b1c187-73c8-482d-a938-bf716047fdf2" containerName="console" containerID="cri-o://dd9bdae1d0c5928e2b92e92de4ac25d35bff5098df6e7350d2770317e846cf2a" gracePeriod=15 Apr 24 16:41:47.521358 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.521336 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67d788878c-jpsh4_42b1c187-73c8-482d-a938-bf716047fdf2/console/0.log" Apr 24 16:41:47.521479 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.521396 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67d788878c-jpsh4" Apr 24 16:41:47.675795 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.675767 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-console-config\") pod \"42b1c187-73c8-482d-a938-bf716047fdf2\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " Apr 24 16:41:47.675966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.675810 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-service-ca\") pod \"42b1c187-73c8-482d-a938-bf716047fdf2\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " Apr 24 16:41:47.675966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.675848 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42b1c187-73c8-482d-a938-bf716047fdf2-console-oauth-config\") pod \"42b1c187-73c8-482d-a938-bf716047fdf2\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " Apr 24 16:41:47.675966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.675884 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-oauth-serving-cert\") pod \"42b1c187-73c8-482d-a938-bf716047fdf2\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " Apr 24 16:41:47.675966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.675918 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-trusted-ca-bundle\") pod \"42b1c187-73c8-482d-a938-bf716047fdf2\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " Apr 24 16:41:47.675966 ip-10-0-128-44 
kubenswrapper[2561]: I0424 16:41:47.675949 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l7hp\" (UniqueName: \"kubernetes.io/projected/42b1c187-73c8-482d-a938-bf716047fdf2-kube-api-access-7l7hp\") pod \"42b1c187-73c8-482d-a938-bf716047fdf2\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " Apr 24 16:41:47.676229 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.675974 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42b1c187-73c8-482d-a938-bf716047fdf2-console-serving-cert\") pod \"42b1c187-73c8-482d-a938-bf716047fdf2\" (UID: \"42b1c187-73c8-482d-a938-bf716047fdf2\") " Apr 24 16:41:47.676336 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.676251 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-console-config" (OuterVolumeSpecName: "console-config") pod "42b1c187-73c8-482d-a938-bf716047fdf2" (UID: "42b1c187-73c8-482d-a938-bf716047fdf2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:41:47.676411 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.676358 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-service-ca" (OuterVolumeSpecName: "service-ca") pod "42b1c187-73c8-482d-a938-bf716047fdf2" (UID: "42b1c187-73c8-482d-a938-bf716047fdf2"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:41:47.676530 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.676502 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "42b1c187-73c8-482d-a938-bf716047fdf2" (UID: "42b1c187-73c8-482d-a938-bf716047fdf2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:41:47.676593 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.676539 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "42b1c187-73c8-482d-a938-bf716047fdf2" (UID: "42b1c187-73c8-482d-a938-bf716047fdf2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:41:47.678230 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.678200 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b1c187-73c8-482d-a938-bf716047fdf2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "42b1c187-73c8-482d-a938-bf716047fdf2" (UID: "42b1c187-73c8-482d-a938-bf716047fdf2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:41:47.678230 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.678222 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b1c187-73c8-482d-a938-bf716047fdf2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "42b1c187-73c8-482d-a938-bf716047fdf2" (UID: "42b1c187-73c8-482d-a938-bf716047fdf2"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:41:47.678369 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.678277 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b1c187-73c8-482d-a938-bf716047fdf2-kube-api-access-7l7hp" (OuterVolumeSpecName: "kube-api-access-7l7hp") pod "42b1c187-73c8-482d-a938-bf716047fdf2" (UID: "42b1c187-73c8-482d-a938-bf716047fdf2"). InnerVolumeSpecName "kube-api-access-7l7hp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:41:47.777029 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.776946 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-trusted-ca-bundle\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:41:47.777029 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.776976 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7l7hp\" (UniqueName: \"kubernetes.io/projected/42b1c187-73c8-482d-a938-bf716047fdf2-kube-api-access-7l7hp\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:41:47.777029 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.776987 2561 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42b1c187-73c8-482d-a938-bf716047fdf2-console-serving-cert\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:41:47.777029 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.776997 2561 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-console-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:41:47.777029 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.777007 2561 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-service-ca\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:41:47.777029 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.777015 2561 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42b1c187-73c8-482d-a938-bf716047fdf2-console-oauth-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:41:47.777029 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:47.777023 2561 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42b1c187-73c8-482d-a938-bf716047fdf2-oauth-serving-cert\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:41:48.265603 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:48.265574 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67d788878c-jpsh4_42b1c187-73c8-482d-a938-bf716047fdf2/console/0.log" Apr 24 16:41:48.266003 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:48.265618 2561 generic.go:358] "Generic (PLEG): container finished" podID="42b1c187-73c8-482d-a938-bf716047fdf2" containerID="dd9bdae1d0c5928e2b92e92de4ac25d35bff5098df6e7350d2770317e846cf2a" exitCode=2 Apr 24 16:41:48.266003 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:48.265681 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67d788878c-jpsh4" Apr 24 16:41:48.266003 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:48.265704 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67d788878c-jpsh4" event={"ID":"42b1c187-73c8-482d-a938-bf716047fdf2","Type":"ContainerDied","Data":"dd9bdae1d0c5928e2b92e92de4ac25d35bff5098df6e7350d2770317e846cf2a"} Apr 24 16:41:48.266003 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:48.265746 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67d788878c-jpsh4" event={"ID":"42b1c187-73c8-482d-a938-bf716047fdf2","Type":"ContainerDied","Data":"c76d1d5d8629eca13f81c03897233ded77ccbacf236e61dfea8235ecd7684805"} Apr 24 16:41:48.266003 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:48.265761 2561 scope.go:117] "RemoveContainer" containerID="dd9bdae1d0c5928e2b92e92de4ac25d35bff5098df6e7350d2770317e846cf2a" Apr 24 16:41:48.273659 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:48.273630 2561 scope.go:117] "RemoveContainer" containerID="dd9bdae1d0c5928e2b92e92de4ac25d35bff5098df6e7350d2770317e846cf2a" Apr 24 16:41:48.273929 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:41:48.273907 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd9bdae1d0c5928e2b92e92de4ac25d35bff5098df6e7350d2770317e846cf2a\": container with ID starting with dd9bdae1d0c5928e2b92e92de4ac25d35bff5098df6e7350d2770317e846cf2a not found: ID does not exist" containerID="dd9bdae1d0c5928e2b92e92de4ac25d35bff5098df6e7350d2770317e846cf2a" Apr 24 16:41:48.273995 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:48.273936 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd9bdae1d0c5928e2b92e92de4ac25d35bff5098df6e7350d2770317e846cf2a"} err="failed to get container status \"dd9bdae1d0c5928e2b92e92de4ac25d35bff5098df6e7350d2770317e846cf2a\": rpc error: code = 
NotFound desc = could not find container \"dd9bdae1d0c5928e2b92e92de4ac25d35bff5098df6e7350d2770317e846cf2a\": container with ID starting with dd9bdae1d0c5928e2b92e92de4ac25d35bff5098df6e7350d2770317e846cf2a not found: ID does not exist" Apr 24 16:41:48.287718 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:48.287693 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67d788878c-jpsh4"] Apr 24 16:41:48.291870 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:48.291848 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-67d788878c-jpsh4"] Apr 24 16:41:49.650535 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:49.650501 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b1c187-73c8-482d-a938-bf716047fdf2" path="/var/lib/kubelet/pods/42b1c187-73c8-482d-a938-bf716047fdf2/volumes" Apr 24 16:41:58.286524 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.286458 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6bcfb44746-wdp94" podUID="fe09c3ea-c187-4fe4-a682-7d969af71ba6" containerName="console" containerID="cri-o://90902d8517d9c3eb5f86aee62395e3fefb37b4451047a90afceb06b2ae8196ef" gracePeriod=15 Apr 24 16:41:58.508576 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.508554 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bcfb44746-wdp94_fe09c3ea-c187-4fe4-a682-7d969af71ba6/console/0.log" Apr 24 16:41:58.508708 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.508628 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bcfb44746-wdp94" Apr 24 16:41:58.651736 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.651635 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pklb\" (UniqueName: \"kubernetes.io/projected/fe09c3ea-c187-4fe4-a682-7d969af71ba6-kube-api-access-9pklb\") pod \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " Apr 24 16:41:58.651736 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.651694 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-oauth-serving-cert\") pod \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " Apr 24 16:41:58.651736 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.651718 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-config\") pod \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " Apr 24 16:41:58.651988 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.651751 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-service-ca\") pod \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " Apr 24 16:41:58.651988 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.651782 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-trusted-ca-bundle\") pod \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " Apr 24 16:41:58.651988 ip-10-0-128-44 
kubenswrapper[2561]: I0424 16:41:58.651851 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-serving-cert\") pod \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " Apr 24 16:41:58.651988 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.651883 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-oauth-config\") pod \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\" (UID: \"fe09c3ea-c187-4fe4-a682-7d969af71ba6\") " Apr 24 16:41:58.652245 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.652215 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-config" (OuterVolumeSpecName: "console-config") pod "fe09c3ea-c187-4fe4-a682-7d969af71ba6" (UID: "fe09c3ea-c187-4fe4-a682-7d969af71ba6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:41:58.652245 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.652228 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fe09c3ea-c187-4fe4-a682-7d969af71ba6" (UID: "fe09c3ea-c187-4fe4-a682-7d969af71ba6"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:41:58.652361 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.652237 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-service-ca" (OuterVolumeSpecName: "service-ca") pod "fe09c3ea-c187-4fe4-a682-7d969af71ba6" (UID: "fe09c3ea-c187-4fe4-a682-7d969af71ba6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:41:58.652361 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.652253 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fe09c3ea-c187-4fe4-a682-7d969af71ba6" (UID: "fe09c3ea-c187-4fe4-a682-7d969af71ba6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:41:58.653831 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.653805 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe09c3ea-c187-4fe4-a682-7d969af71ba6-kube-api-access-9pklb" (OuterVolumeSpecName: "kube-api-access-9pklb") pod "fe09c3ea-c187-4fe4-a682-7d969af71ba6" (UID: "fe09c3ea-c187-4fe4-a682-7d969af71ba6"). InnerVolumeSpecName "kube-api-access-9pklb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:41:58.653949 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.653921 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fe09c3ea-c187-4fe4-a682-7d969af71ba6" (UID: "fe09c3ea-c187-4fe4-a682-7d969af71ba6"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:41:58.654002 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.653932 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fe09c3ea-c187-4fe4-a682-7d969af71ba6" (UID: "fe09c3ea-c187-4fe4-a682-7d969af71ba6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:41:58.753152 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.753125 2561 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-service-ca\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:41:58.753152 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.753148 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-trusted-ca-bundle\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:41:58.753152 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.753159 2561 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-serving-cert\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:41:58.753337 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.753168 2561 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-oauth-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:41:58.753337 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.753176 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9pklb\" (UniqueName: 
\"kubernetes.io/projected/fe09c3ea-c187-4fe4-a682-7d969af71ba6-kube-api-access-9pklb\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:41:58.753337 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.753186 2561 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-oauth-serving-cert\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:41:58.753337 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:58.753195 2561 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe09c3ea-c187-4fe4-a682-7d969af71ba6-console-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:41:59.295993 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:59.295966 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bcfb44746-wdp94_fe09c3ea-c187-4fe4-a682-7d969af71ba6/console/0.log" Apr 24 16:41:59.296430 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:59.296004 2561 generic.go:358] "Generic (PLEG): container finished" podID="fe09c3ea-c187-4fe4-a682-7d969af71ba6" containerID="90902d8517d9c3eb5f86aee62395e3fefb37b4451047a90afceb06b2ae8196ef" exitCode=2 Apr 24 16:41:59.296430 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:59.296054 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bcfb44746-wdp94" event={"ID":"fe09c3ea-c187-4fe4-a682-7d969af71ba6","Type":"ContainerDied","Data":"90902d8517d9c3eb5f86aee62395e3fefb37b4451047a90afceb06b2ae8196ef"} Apr 24 16:41:59.296430 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:59.296070 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bcfb44746-wdp94" Apr 24 16:41:59.296430 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:59.296084 2561 scope.go:117] "RemoveContainer" containerID="90902d8517d9c3eb5f86aee62395e3fefb37b4451047a90afceb06b2ae8196ef" Apr 24 16:41:59.296430 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:59.296075 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bcfb44746-wdp94" event={"ID":"fe09c3ea-c187-4fe4-a682-7d969af71ba6","Type":"ContainerDied","Data":"6b10c970396e36bc550b38e3c378c2dcd3a831b10ab2667b554e23695c2433c6"} Apr 24 16:41:59.303850 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:59.303828 2561 scope.go:117] "RemoveContainer" containerID="90902d8517d9c3eb5f86aee62395e3fefb37b4451047a90afceb06b2ae8196ef" Apr 24 16:41:59.304097 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:41:59.304068 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90902d8517d9c3eb5f86aee62395e3fefb37b4451047a90afceb06b2ae8196ef\": container with ID starting with 90902d8517d9c3eb5f86aee62395e3fefb37b4451047a90afceb06b2ae8196ef not found: ID does not exist" containerID="90902d8517d9c3eb5f86aee62395e3fefb37b4451047a90afceb06b2ae8196ef" Apr 24 16:41:59.304171 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:59.304105 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90902d8517d9c3eb5f86aee62395e3fefb37b4451047a90afceb06b2ae8196ef"} err="failed to get container status \"90902d8517d9c3eb5f86aee62395e3fefb37b4451047a90afceb06b2ae8196ef\": rpc error: code = NotFound desc = could not find container \"90902d8517d9c3eb5f86aee62395e3fefb37b4451047a90afceb06b2ae8196ef\": container with ID starting with 90902d8517d9c3eb5f86aee62395e3fefb37b4451047a90afceb06b2ae8196ef not found: ID does not exist" Apr 24 16:41:59.316516 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:59.316495 2561 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-6bcfb44746-wdp94"]
Apr 24 16:41:59.324308 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:59.321174 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6bcfb44746-wdp94"]
Apr 24 16:41:59.650290 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:41:59.650207 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe09c3ea-c187-4fe4-a682-7d969af71ba6" path="/var/lib/kubelet/pods/fe09c3ea-c187-4fe4-a682-7d969af71ba6/volumes"
Apr 24 16:42:20.884183 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:20.884148 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"]
Apr 24 16:42:20.884583 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:20.884424 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42b1c187-73c8-482d-a938-bf716047fdf2" containerName="console"
Apr 24 16:42:20.884583 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:20.884436 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b1c187-73c8-482d-a938-bf716047fdf2" containerName="console"
Apr 24 16:42:20.884583 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:20.884448 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe09c3ea-c187-4fe4-a682-7d969af71ba6" containerName="console"
Apr 24 16:42:20.884583 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:20.884455 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe09c3ea-c187-4fe4-a682-7d969af71ba6" containerName="console"
Apr 24 16:42:20.884583 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:20.884494 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="42b1c187-73c8-482d-a938-bf716047fdf2" containerName="console"
Apr 24 16:42:20.884583 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:20.884507 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe09c3ea-c187-4fe4-a682-7d969af71ba6" containerName="console"
Apr 24 16:42:20.887329 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:20.887312 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"
Apr 24 16:42:20.891767 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:20.891748 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 16:42:20.891857 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:20.891770 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 16:42:20.892857 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:20.892840 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 24 16:42:20.892925 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:20.892844 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 16:42:20.898962 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:20.898940 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"]
Apr 24 16:42:21.007485 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:21.007460 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6p7g\" (UniqueName: \"kubernetes.io/projected/d66af5cd-0e94-4d06-bdce-1c5b63bb3c53-kube-api-access-h6p7g\") pod \"klusterlet-addon-workmgr-6fd5884f79-q85hf\" (UID: \"d66af5cd-0e94-4d06-bdce-1c5b63bb3c53\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"
Apr 24 16:42:21.007628 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:21.007498 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d66af5cd-0e94-4d06-bdce-1c5b63bb3c53-klusterlet-config\") pod \"klusterlet-addon-workmgr-6fd5884f79-q85hf\" (UID: \"d66af5cd-0e94-4d06-bdce-1c5b63bb3c53\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"
Apr 24 16:42:21.007628 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:21.007519 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d66af5cd-0e94-4d06-bdce-1c5b63bb3c53-tmp\") pod \"klusterlet-addon-workmgr-6fd5884f79-q85hf\" (UID: \"d66af5cd-0e94-4d06-bdce-1c5b63bb3c53\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"
Apr 24 16:42:21.107883 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:21.107838 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6p7g\" (UniqueName: \"kubernetes.io/projected/d66af5cd-0e94-4d06-bdce-1c5b63bb3c53-kube-api-access-h6p7g\") pod \"klusterlet-addon-workmgr-6fd5884f79-q85hf\" (UID: \"d66af5cd-0e94-4d06-bdce-1c5b63bb3c53\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"
Apr 24 16:42:21.108038 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:21.107916 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d66af5cd-0e94-4d06-bdce-1c5b63bb3c53-klusterlet-config\") pod \"klusterlet-addon-workmgr-6fd5884f79-q85hf\" (UID: \"d66af5cd-0e94-4d06-bdce-1c5b63bb3c53\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"
Apr 24 16:42:21.108038 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:21.107951 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d66af5cd-0e94-4d06-bdce-1c5b63bb3c53-tmp\") pod \"klusterlet-addon-workmgr-6fd5884f79-q85hf\" (UID: \"d66af5cd-0e94-4d06-bdce-1c5b63bb3c53\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"
Apr 24 16:42:21.108345 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:21.108325 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d66af5cd-0e94-4d06-bdce-1c5b63bb3c53-tmp\") pod \"klusterlet-addon-workmgr-6fd5884f79-q85hf\" (UID: \"d66af5cd-0e94-4d06-bdce-1c5b63bb3c53\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"
Apr 24 16:42:21.110433 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:21.110415 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d66af5cd-0e94-4d06-bdce-1c5b63bb3c53-klusterlet-config\") pod \"klusterlet-addon-workmgr-6fd5884f79-q85hf\" (UID: \"d66af5cd-0e94-4d06-bdce-1c5b63bb3c53\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"
Apr 24 16:42:21.116142 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:21.116104 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6p7g\" (UniqueName: \"kubernetes.io/projected/d66af5cd-0e94-4d06-bdce-1c5b63bb3c53-kube-api-access-h6p7g\") pod \"klusterlet-addon-workmgr-6fd5884f79-q85hf\" (UID: \"d66af5cd-0e94-4d06-bdce-1c5b63bb3c53\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"
Apr 24 16:42:21.196655 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:21.196627 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"
Apr 24 16:42:21.309052 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:21.309024 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"]
Apr 24 16:42:21.312221 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:42:21.312189 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd66af5cd_0e94_4d06_bdce_1c5b63bb3c53.slice/crio-4979ed3249ccfb6204d4289de78a16fc40dbadf76bd176ae813885cb1df26eb7 WatchSource:0}: Error finding container 4979ed3249ccfb6204d4289de78a16fc40dbadf76bd176ae813885cb1df26eb7: Status 404 returned error can't find the container with id 4979ed3249ccfb6204d4289de78a16fc40dbadf76bd176ae813885cb1df26eb7
Apr 24 16:42:21.354849 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:21.354820 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf" event={"ID":"d66af5cd-0e94-4d06-bdce-1c5b63bb3c53","Type":"ContainerStarted","Data":"4979ed3249ccfb6204d4289de78a16fc40dbadf76bd176ae813885cb1df26eb7"}
Apr 24 16:42:25.370422 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:25.370336 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf" event={"ID":"d66af5cd-0e94-4d06-bdce-1c5b63bb3c53","Type":"ContainerStarted","Data":"b90e335fac583bb824665eee0f18c48146998477aa8f696cd1ae0baa0d9deda5"}
Apr 24 16:42:25.370778 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:25.370505 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"
Apr 24 16:42:25.372079 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:25.372056 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf"
Apr 24 16:42:25.387625 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:25.387587 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6fd5884f79-q85hf" podStartSLOduration=1.61298727 podStartE2EDuration="5.387575183s" podCreationTimestamp="2026-04-24 16:42:20 +0000 UTC" firstStartedPulling="2026-04-24 16:42:21.314247331 +0000 UTC m=+210.167118023" lastFinishedPulling="2026-04-24 16:42:25.08883523 +0000 UTC m=+213.941705936" observedRunningTime="2026-04-24 16:42:25.385831673 +0000 UTC m=+214.238702386" watchObservedRunningTime="2026-04-24 16:42:25.387575183 +0000 UTC m=+214.240445897"
Apr 24 16:42:38.204398 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:38.204367 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j"]
Apr 24 16:42:38.207539 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:38.207524 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j"
Apr 24 16:42:38.213607 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:38.213206 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 24 16:42:38.213607 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:38.213229 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 24 16:42:38.213607 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:38.213241 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-49sms\""
Apr 24 16:42:38.213607 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:38.213550 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 24 16:42:38.222005 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:38.221985 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j"]
Apr 24 16:42:38.225111 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:38.225088 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a581c85d-8190-4829-9e60-9dc68ba5d70b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rn46j\" (UID: \"a581c85d-8190-4829-9e60-9dc68ba5d70b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j"
Apr 24 16:42:38.225234 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:38.225158 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsmmf\" (UniqueName: \"kubernetes.io/projected/a581c85d-8190-4829-9e60-9dc68ba5d70b-kube-api-access-tsmmf\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rn46j\" (UID: \"a581c85d-8190-4829-9e60-9dc68ba5d70b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j"
Apr 24 16:42:38.325763 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:38.325736 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsmmf\" (UniqueName: \"kubernetes.io/projected/a581c85d-8190-4829-9e60-9dc68ba5d70b-kube-api-access-tsmmf\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rn46j\" (UID: \"a581c85d-8190-4829-9e60-9dc68ba5d70b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j"
Apr 24 16:42:38.325916 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:38.325792 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a581c85d-8190-4829-9e60-9dc68ba5d70b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rn46j\" (UID: \"a581c85d-8190-4829-9e60-9dc68ba5d70b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j"
Apr 24 16:42:38.328041 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:38.328018 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/a581c85d-8190-4829-9e60-9dc68ba5d70b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rn46j\" (UID: \"a581c85d-8190-4829-9e60-9dc68ba5d70b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j"
Apr 24 16:42:38.334366 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:38.334345 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsmmf\" (UniqueName: \"kubernetes.io/projected/a581c85d-8190-4829-9e60-9dc68ba5d70b-kube-api-access-tsmmf\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rn46j\" (UID: \"a581c85d-8190-4829-9e60-9dc68ba5d70b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j"
Apr 24 16:42:38.517283 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:38.517198 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j"
Apr 24 16:42:38.639099 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:38.639061 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j"]
Apr 24 16:42:38.641176 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:42:38.641150 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda581c85d_8190_4829_9e60_9dc68ba5d70b.slice/crio-b9429c0538013abc936d5203ce079c601e51288aae6ce03f0465d3e0fef5181e WatchSource:0}: Error finding container b9429c0538013abc936d5203ce079c601e51288aae6ce03f0465d3e0fef5181e: Status 404 returned error can't find the container with id b9429c0538013abc936d5203ce079c601e51288aae6ce03f0465d3e0fef5181e
Apr 24 16:42:39.408508 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:39.408460 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j" event={"ID":"a581c85d-8190-4829-9e60-9dc68ba5d70b","Type":"ContainerStarted","Data":"b9429c0538013abc936d5203ce079c601e51288aae6ce03f0465d3e0fef5181e"}
Apr 24 16:42:42.420249 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.420161 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j" event={"ID":"a581c85d-8190-4829-9e60-9dc68ba5d70b","Type":"ContainerStarted","Data":"3503608c1e6228957809022218f5f324c05b6317c1a64603b2914b8cbde37ae0"}
Apr 24 16:42:42.420619 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.420374 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j"
Apr 24 16:42:42.440382 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.440335 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j" podStartSLOduration=1.135223447 podStartE2EDuration="4.440321099s" podCreationTimestamp="2026-04-24 16:42:38 +0000 UTC" firstStartedPulling="2026-04-24 16:42:38.643168353 +0000 UTC m=+227.496039059" lastFinishedPulling="2026-04-24 16:42:41.948266004 +0000 UTC m=+230.801136711" observedRunningTime="2026-04-24 16:42:42.439168466 +0000 UTC m=+231.292039180" watchObservedRunningTime="2026-04-24 16:42:42.440321099 +0000 UTC m=+231.293191814"
Apr 24 16:42:42.509390 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.509360 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-44kbh"]
Apr 24 16:42:42.512416 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.512400 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-44kbh"
Apr 24 16:42:42.515343 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.515320 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 24 16:42:42.515463 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.515435 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 24 16:42:42.515463 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.515443 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-hx4rw\""
Apr 24 16:42:42.520365 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.520345 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-44kbh"]
Apr 24 16:42:42.551048 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.551028 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7znbg\" (UniqueName: \"kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-kube-api-access-7znbg\") pod \"keda-operator-ffbb595cb-44kbh\" (UID: \"7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e\") " pod="openshift-keda/keda-operator-ffbb595cb-44kbh"
Apr 24 16:42:42.551175 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.551053 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-certificates\") pod \"keda-operator-ffbb595cb-44kbh\" (UID: \"7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e\") " pod="openshift-keda/keda-operator-ffbb595cb-44kbh"
Apr 24 16:42:42.551175 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.551089 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-cabundle0\") pod \"keda-operator-ffbb595cb-44kbh\" (UID: \"7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e\") " pod="openshift-keda/keda-operator-ffbb595cb-44kbh"
Apr 24 16:42:42.651525 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.651494 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-cabundle0\") pod \"keda-operator-ffbb595cb-44kbh\" (UID: \"7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e\") " pod="openshift-keda/keda-operator-ffbb595cb-44kbh"
Apr 24 16:42:42.651699 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.651555 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7znbg\" (UniqueName: \"kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-kube-api-access-7znbg\") pod \"keda-operator-ffbb595cb-44kbh\" (UID: \"7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e\") " pod="openshift-keda/keda-operator-ffbb595cb-44kbh"
Apr 24 16:42:42.651699 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.651574 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-certificates\") pod \"keda-operator-ffbb595cb-44kbh\" (UID: \"7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e\") " pod="openshift-keda/keda-operator-ffbb595cb-44kbh"
Apr 24 16:42:42.651699 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:42.651657 2561 secret.go:281] references non-existent secret key: ca.crt
Apr 24 16:42:42.651699 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:42.651668 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 16:42:42.651699 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:42.651676 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-44kbh: references non-existent secret key: ca.crt
Apr 24 16:42:42.651962 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:42.651722 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-certificates podName:7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e nodeName:}" failed. No retries permitted until 2026-04-24 16:42:43.151706615 +0000 UTC m=+232.004577307 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-certificates") pod "keda-operator-ffbb595cb-44kbh" (UID: "7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e") : references non-existent secret key: ca.crt
Apr 24 16:42:42.652271 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.652249 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-cabundle0\") pod \"keda-operator-ffbb595cb-44kbh\" (UID: \"7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e\") " pod="openshift-keda/keda-operator-ffbb595cb-44kbh"
Apr 24 16:42:42.659943 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.659919 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7znbg\" (UniqueName: \"kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-kube-api-access-7znbg\") pod \"keda-operator-ffbb595cb-44kbh\" (UID: \"7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e\") " pod="openshift-keda/keda-operator-ffbb595cb-44kbh"
Apr 24 16:42:42.821544 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.821511 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh"]
Apr 24 16:42:42.824641 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.824622 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh"
Apr 24 16:42:42.827165 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.827146 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 24 16:42:42.835182 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.835160 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh"]
Apr 24 16:42:42.852466 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.852442 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/d351a555-501e-4735-80d0-4786c893c818-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-2d2lh\" (UID: \"d351a555-501e-4735-80d0-4786c893c818\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh"
Apr 24 16:42:42.852551 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.852508 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2d2lh\" (UID: \"d351a555-501e-4735-80d0-4786c893c818\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh"
Apr 24 16:42:42.852598 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.852557 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n95sx\" (UniqueName: \"kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-kube-api-access-n95sx\") pod \"keda-metrics-apiserver-7c9f485588-2d2lh\" (UID: \"d351a555-501e-4735-80d0-4786c893c818\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh"
Apr 24 16:42:42.953410 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.953382 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n95sx\" (UniqueName: \"kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-kube-api-access-n95sx\") pod \"keda-metrics-apiserver-7c9f485588-2d2lh\" (UID: \"d351a555-501e-4735-80d0-4786c893c818\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh"
Apr 24 16:42:42.953528 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.953414 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/d351a555-501e-4735-80d0-4786c893c818-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-2d2lh\" (UID: \"d351a555-501e-4735-80d0-4786c893c818\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh"
Apr 24 16:42:42.953528 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.953473 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2d2lh\" (UID: \"d351a555-501e-4735-80d0-4786c893c818\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh"
Apr 24 16:42:42.953598 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:42.953566 2561 secret.go:281] references non-existent secret key: tls.crt
Apr 24 16:42:42.953598 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:42.953577 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 16:42:42.953598 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:42.953592 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh: references non-existent secret key: tls.crt
Apr 24 16:42:42.953684 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:42.953634 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-certificates podName:d351a555-501e-4735-80d0-4786c893c818 nodeName:}" failed. No retries permitted until 2026-04-24 16:42:43.453619202 +0000 UTC m=+232.306489913 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-certificates") pod "keda-metrics-apiserver-7c9f485588-2d2lh" (UID: "d351a555-501e-4735-80d0-4786c893c818") : references non-existent secret key: tls.crt
Apr 24 16:42:42.953843 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.953824 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/d351a555-501e-4735-80d0-4786c893c818-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-2d2lh\" (UID: \"d351a555-501e-4735-80d0-4786c893c818\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh"
Apr 24 16:42:42.968367 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:42.968341 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n95sx\" (UniqueName: \"kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-kube-api-access-n95sx\") pod \"keda-metrics-apiserver-7c9f485588-2d2lh\" (UID: \"d351a555-501e-4735-80d0-4786c893c818\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh"
Apr 24 16:42:43.059866 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:43.059813 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-hqvtr"]
Apr 24 16:42:43.062941 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:43.062926 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-hqvtr"
Apr 24 16:42:43.066538 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:43.066510 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 24 16:42:43.079765 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:43.079691 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-hqvtr"]
Apr 24 16:42:43.154781 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:43.154753 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knn75\" (UniqueName: \"kubernetes.io/projected/db589ecd-a3e9-4309-9150-6d9b656f1ae0-kube-api-access-knn75\") pod \"keda-admission-cf49989db-hqvtr\" (UID: \"db589ecd-a3e9-4309-9150-6d9b656f1ae0\") " pod="openshift-keda/keda-admission-cf49989db-hqvtr"
Apr 24 16:42:43.154919 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:43.154798 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-certificates\") pod \"keda-operator-ffbb595cb-44kbh\" (UID: \"7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e\") " pod="openshift-keda/keda-operator-ffbb595cb-44kbh"
Apr 24 16:42:43.154919 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:43.154825 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/db589ecd-a3e9-4309-9150-6d9b656f1ae0-certificates\") pod \"keda-admission-cf49989db-hqvtr\" (UID: \"db589ecd-a3e9-4309-9150-6d9b656f1ae0\") " pod="openshift-keda/keda-admission-cf49989db-hqvtr"
Apr 24 16:42:43.154989 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:43.154920 2561 secret.go:281] references non-existent secret key: ca.crt
Apr 24 16:42:43.154989 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:43.154937 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 16:42:43.154989 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:43.154946 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-44kbh: references non-existent secret key: ca.crt
Apr 24 16:42:43.155084 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:43.154992 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-certificates podName:7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e nodeName:}" failed. No retries permitted until 2026-04-24 16:42:44.154978535 +0000 UTC m=+233.007849227 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-certificates") pod "keda-operator-ffbb595cb-44kbh" (UID: "7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e") : references non-existent secret key: ca.crt
Apr 24 16:42:43.255482 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:43.255452 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knn75\" (UniqueName: \"kubernetes.io/projected/db589ecd-a3e9-4309-9150-6d9b656f1ae0-kube-api-access-knn75\") pod \"keda-admission-cf49989db-hqvtr\" (UID: \"db589ecd-a3e9-4309-9150-6d9b656f1ae0\") " pod="openshift-keda/keda-admission-cf49989db-hqvtr"
Apr 24 16:42:43.255607 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:43.255502 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/db589ecd-a3e9-4309-9150-6d9b656f1ae0-certificates\") pod \"keda-admission-cf49989db-hqvtr\" (UID: \"db589ecd-a3e9-4309-9150-6d9b656f1ae0\") " pod="openshift-keda/keda-admission-cf49989db-hqvtr"
Apr 24 16:42:43.257884 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:43.257864 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/db589ecd-a3e9-4309-9150-6d9b656f1ae0-certificates\") pod \"keda-admission-cf49989db-hqvtr\" (UID: \"db589ecd-a3e9-4309-9150-6d9b656f1ae0\") " pod="openshift-keda/keda-admission-cf49989db-hqvtr"
Apr 24 16:42:43.265036 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:43.265012 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knn75\" (UniqueName: \"kubernetes.io/projected/db589ecd-a3e9-4309-9150-6d9b656f1ae0-kube-api-access-knn75\") pod \"keda-admission-cf49989db-hqvtr\" (UID: \"db589ecd-a3e9-4309-9150-6d9b656f1ae0\") " pod="openshift-keda/keda-admission-cf49989db-hqvtr"
Apr 24 16:42:43.372701 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:43.372621 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-hqvtr"
Apr 24 16:42:43.457872 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:43.457839 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2d2lh\" (UID: \"d351a555-501e-4735-80d0-4786c893c818\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh"
Apr 24 16:42:43.458195 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:43.458141 2561 secret.go:281] references non-existent secret key: tls.crt
Apr 24 16:42:43.458195 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:43.458163 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 16:42:43.458195 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:43.458183 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh: references non-existent secret key: tls.crt
Apr 24 16:42:43.458295 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:43.458235 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-certificates podName:d351a555-501e-4735-80d0-4786c893c818 nodeName:}" failed. No retries permitted until 2026-04-24 16:42:44.4582158 +0000 UTC m=+233.311086499 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-certificates") pod "keda-metrics-apiserver-7c9f485588-2d2lh" (UID: "d351a555-501e-4735-80d0-4786c893c818") : references non-existent secret key: tls.crt
Apr 24 16:42:43.493889 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:43.493868 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-hqvtr"]
Apr 24 16:42:43.496009 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:42:43.495979 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb589ecd_a3e9_4309_9150_6d9b656f1ae0.slice/crio-abdc72bb155ad7166df489e0f75255cfeaff361f104f648f497eecd52da9a7f5 WatchSource:0}: Error finding container abdc72bb155ad7166df489e0f75255cfeaff361f104f648f497eecd52da9a7f5: Status 404 returned error can't find the container with id abdc72bb155ad7166df489e0f75255cfeaff361f104f648f497eecd52da9a7f5
Apr 24 16:42:44.162094 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:44.162058 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-certificates\") pod \"keda-operator-ffbb595cb-44kbh\" (UID: \"7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e\") " pod="openshift-keda/keda-operator-ffbb595cb-44kbh"
Apr 24 16:42:44.162252 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:44.162209 2561 secret.go:281] references non-existent secret key: ca.crt
Apr 24 16:42:44.162252 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:44.162226 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 16:42:44.162252 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:44.162235 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-44kbh: references non-existent secret key: ca.crt
Apr 24 16:42:44.162367 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:44.162286 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-certificates podName:7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e nodeName:}" failed. No retries permitted until 2026-04-24 16:42:46.162271045 +0000 UTC m=+235.015141737 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-certificates") pod "keda-operator-ffbb595cb-44kbh" (UID: "7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e") : references non-existent secret key: ca.crt
Apr 24 16:42:44.429432 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:44.429348 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-hqvtr" event={"ID":"db589ecd-a3e9-4309-9150-6d9b656f1ae0","Type":"ContainerStarted","Data":"abdc72bb155ad7166df489e0f75255cfeaff361f104f648f497eecd52da9a7f5"}
Apr 24 16:42:44.464820 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:44.464789 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2d2lh\" (UID: \"d351a555-501e-4735-80d0-4786c893c818\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh"
Apr 24 16:42:44.465194 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:44.464907 2561 secret.go:281] references non-existent secret key:
tls.crt Apr 24 16:42:44.465194 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:44.464920 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 16:42:44.465194 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:44.464937 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh: references non-existent secret key: tls.crt Apr 24 16:42:44.465194 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:44.464988 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-certificates podName:d351a555-501e-4735-80d0-4786c893c818 nodeName:}" failed. No retries permitted until 2026-04-24 16:42:46.464971057 +0000 UTC m=+235.317841750 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-certificates") pod "keda-metrics-apiserver-7c9f485588-2d2lh" (UID: "d351a555-501e-4735-80d0-4786c893c818") : references non-existent secret key: tls.crt Apr 24 16:42:45.433832 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:45.433796 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-hqvtr" event={"ID":"db589ecd-a3e9-4309-9150-6d9b656f1ae0","Type":"ContainerStarted","Data":"d8fe5d4038086a8a23ecd1a318a82b33403164e642014a8c9305ca6e72125dd1"} Apr 24 16:42:45.433998 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:45.433879 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-hqvtr" Apr 24 16:42:45.485511 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:45.485461 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-hqvtr" podStartSLOduration=1.079086466 podStartE2EDuration="2.485444151s" 
podCreationTimestamp="2026-04-24 16:42:43 +0000 UTC" firstStartedPulling="2026-04-24 16:42:43.497440852 +0000 UTC m=+232.350311547" lastFinishedPulling="2026-04-24 16:42:44.903798537 +0000 UTC m=+233.756669232" observedRunningTime="2026-04-24 16:42:45.483565253 +0000 UTC m=+234.336435968" watchObservedRunningTime="2026-04-24 16:42:45.485444151 +0000 UTC m=+234.338314867" Apr 24 16:42:46.178884 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:46.178850 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-certificates\") pod \"keda-operator-ffbb595cb-44kbh\" (UID: \"7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e\") " pod="openshift-keda/keda-operator-ffbb595cb-44kbh" Apr 24 16:42:46.179051 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:46.178986 2561 secret.go:281] references non-existent secret key: ca.crt Apr 24 16:42:46.179051 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:46.179000 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 16:42:46.179051 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:46.179008 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-44kbh: references non-existent secret key: ca.crt Apr 24 16:42:46.179185 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:46.179056 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-certificates podName:7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e nodeName:}" failed. No retries permitted until 2026-04-24 16:42:50.179040494 +0000 UTC m=+239.031911197 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-certificates") pod "keda-operator-ffbb595cb-44kbh" (UID: "7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e") : references non-existent secret key: ca.crt Apr 24 16:42:46.481437 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:46.481403 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2d2lh\" (UID: \"d351a555-501e-4735-80d0-4786c893c818\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh" Apr 24 16:42:46.481642 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:46.481523 2561 secret.go:281] references non-existent secret key: tls.crt Apr 24 16:42:46.481642 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:46.481536 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 16:42:46.481642 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:46.481551 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh: references non-existent secret key: tls.crt Apr 24 16:42:46.481642 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:42:46.481598 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-certificates podName:d351a555-501e-4735-80d0-4786c893c818 nodeName:}" failed. No retries permitted until 2026-04-24 16:42:50.481583586 +0000 UTC m=+239.334454278 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-certificates") pod "keda-metrics-apiserver-7c9f485588-2d2lh" (UID: "d351a555-501e-4735-80d0-4786c893c818") : references non-existent secret key: tls.crt Apr 24 16:42:50.207514 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:50.207460 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-certificates\") pod \"keda-operator-ffbb595cb-44kbh\" (UID: \"7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e\") " pod="openshift-keda/keda-operator-ffbb595cb-44kbh" Apr 24 16:42:50.209814 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:50.209793 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e-certificates\") pod \"keda-operator-ffbb595cb-44kbh\" (UID: \"7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e\") " pod="openshift-keda/keda-operator-ffbb595cb-44kbh" Apr 24 16:42:50.322749 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:50.322721 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-44kbh" Apr 24 16:42:50.437258 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:50.437089 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-44kbh"] Apr 24 16:42:50.439805 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:42:50.439777 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b9fe1f6_d10f_49e4_9e7e_ce9ef239655e.slice/crio-66fc7514c411be8e780ad5ef1cc171c7264537cdf08df523fc7eeb0bd3de6472 WatchSource:0}: Error finding container 66fc7514c411be8e780ad5ef1cc171c7264537cdf08df523fc7eeb0bd3de6472: Status 404 returned error can't find the container with id 66fc7514c411be8e780ad5ef1cc171c7264537cdf08df523fc7eeb0bd3de6472 Apr 24 16:42:50.450973 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:50.450946 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-44kbh" event={"ID":"7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e","Type":"ContainerStarted","Data":"66fc7514c411be8e780ad5ef1cc171c7264537cdf08df523fc7eeb0bd3de6472"} Apr 24 16:42:50.509654 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:50.509587 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2d2lh\" (UID: \"d351a555-501e-4735-80d0-4786c893c818\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh" Apr 24 16:42:50.511971 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:50.511947 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d351a555-501e-4735-80d0-4786c893c818-certificates\") pod \"keda-metrics-apiserver-7c9f485588-2d2lh\" (UID: \"d351a555-501e-4735-80d0-4786c893c818\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh" Apr 24 16:42:50.635252 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:50.635216 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh" Apr 24 16:42:50.745905 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:50.745873 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh"] Apr 24 16:42:50.748657 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:42:50.748621 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd351a555_501e_4735_80d0_4786c893c818.slice/crio-d09b0d8e210ff279802e26c671bc97e49983f88fe1740df2293e4ce9aefbdb9b WatchSource:0}: Error finding container d09b0d8e210ff279802e26c671bc97e49983f88fe1740df2293e4ce9aefbdb9b: Status 404 returned error can't find the container with id d09b0d8e210ff279802e26c671bc97e49983f88fe1740df2293e4ce9aefbdb9b Apr 24 16:42:51.455982 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:51.455945 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh" event={"ID":"d351a555-501e-4735-80d0-4786c893c818","Type":"ContainerStarted","Data":"d09b0d8e210ff279802e26c671bc97e49983f88fe1740df2293e4ce9aefbdb9b"} Apr 24 16:42:54.468838 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:54.468798 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh" event={"ID":"d351a555-501e-4735-80d0-4786c893c818","Type":"ContainerStarted","Data":"25191c7cd223342db0a2c55ed86a38060b57a6174d1949e4757013befa42d05d"} Apr 24 16:42:54.469308 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:54.469011 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh" Apr 24 16:42:54.470037 ip-10-0-128-44 
kubenswrapper[2561]: I0424 16:42:54.470007 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-44kbh" event={"ID":"7b9fe1f6-d10f-49e4-9e7e-ce9ef239655e","Type":"ContainerStarted","Data":"a69e666fa088f212d97b986700b748af95579e2085fd707c0fbd32c930718842"} Apr 24 16:42:54.470172 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:54.470144 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-44kbh" Apr 24 16:42:54.487035 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:54.486941 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh" podStartSLOduration=8.998346876 podStartE2EDuration="12.486923538s" podCreationTimestamp="2026-04-24 16:42:42 +0000 UTC" firstStartedPulling="2026-04-24 16:42:50.749925042 +0000 UTC m=+239.602795733" lastFinishedPulling="2026-04-24 16:42:54.238501691 +0000 UTC m=+243.091372395" observedRunningTime="2026-04-24 16:42:54.485618351 +0000 UTC m=+243.338489064" watchObservedRunningTime="2026-04-24 16:42:54.486923538 +0000 UTC m=+243.339794254" Apr 24 16:42:54.504503 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:42:54.504447 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-44kbh" podStartSLOduration=8.708066289 podStartE2EDuration="12.504427781s" podCreationTimestamp="2026-04-24 16:42:42 +0000 UTC" firstStartedPulling="2026-04-24 16:42:50.440941558 +0000 UTC m=+239.293812251" lastFinishedPulling="2026-04-24 16:42:54.23730305 +0000 UTC m=+243.090173743" observedRunningTime="2026-04-24 16:42:54.503180117 +0000 UTC m=+243.356050832" watchObservedRunningTime="2026-04-24 16:42:54.504427781 +0000 UTC m=+243.357298497" Apr 24 16:43:03.426692 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:03.426652 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn46j" Apr 24 16:43:05.477541 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:05.477507 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-2d2lh" Apr 24 16:43:06.439630 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:06.439602 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-hqvtr" Apr 24 16:43:15.475450 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:15.475366 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-44kbh" Apr 24 16:43:51.575222 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:51.575179 2561 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 16:43:52.720865 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.720833 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-pmhk2"] Apr 24 16:43:52.724165 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.724149 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" Apr 24 16:43:52.726455 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.726433 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 16:43:52.726574 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.726433 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 16:43:52.727433 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.727411 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 24 16:43:52.727519 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.727479 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-nxbbh\"" Apr 24 16:43:52.734541 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.734522 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-pmhk2"] Apr 24 16:43:52.764812 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.764789 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-wxq7p"] Apr 24 16:43:52.767790 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.767773 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-wxq7p" Apr 24 16:43:52.770299 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.770280 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 16:43:52.770405 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.770355 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-j66k8\"" Apr 24 16:43:52.776956 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.776936 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-wxq7p"] Apr 24 16:43:52.782884 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.782861 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf66743c-a822-49b5-8ccd-3ed04c6eda8e-cert\") pod \"kserve-controller-manager-7f7fb4c66f-pmhk2\" (UID: \"cf66743c-a822-49b5-8ccd-3ed04c6eda8e\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" Apr 24 16:43:52.782974 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.782900 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scndg\" (UniqueName: \"kubernetes.io/projected/8f7987de-e1ca-4187-9ea5-6347c18b87b5-kube-api-access-scndg\") pod \"seaweedfs-86cc847c5c-wxq7p\" (UID: \"8f7987de-e1ca-4187-9ea5-6347c18b87b5\") " pod="kserve/seaweedfs-86cc847c5c-wxq7p" Apr 24 16:43:52.782974 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.782950 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqthx\" (UniqueName: \"kubernetes.io/projected/cf66743c-a822-49b5-8ccd-3ed04c6eda8e-kube-api-access-xqthx\") pod \"kserve-controller-manager-7f7fb4c66f-pmhk2\" (UID: \"cf66743c-a822-49b5-8ccd-3ed04c6eda8e\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" Apr 24 16:43:52.783059 
ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.783010 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8f7987de-e1ca-4187-9ea5-6347c18b87b5-data\") pod \"seaweedfs-86cc847c5c-wxq7p\" (UID: \"8f7987de-e1ca-4187-9ea5-6347c18b87b5\") " pod="kserve/seaweedfs-86cc847c5c-wxq7p" Apr 24 16:43:52.883495 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.883465 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqthx\" (UniqueName: \"kubernetes.io/projected/cf66743c-a822-49b5-8ccd-3ed04c6eda8e-kube-api-access-xqthx\") pod \"kserve-controller-manager-7f7fb4c66f-pmhk2\" (UID: \"cf66743c-a822-49b5-8ccd-3ed04c6eda8e\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" Apr 24 16:43:52.883630 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.883511 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8f7987de-e1ca-4187-9ea5-6347c18b87b5-data\") pod \"seaweedfs-86cc847c5c-wxq7p\" (UID: \"8f7987de-e1ca-4187-9ea5-6347c18b87b5\") " pod="kserve/seaweedfs-86cc847c5c-wxq7p" Apr 24 16:43:52.883630 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.883575 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf66743c-a822-49b5-8ccd-3ed04c6eda8e-cert\") pod \"kserve-controller-manager-7f7fb4c66f-pmhk2\" (UID: \"cf66743c-a822-49b5-8ccd-3ed04c6eda8e\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" Apr 24 16:43:52.883704 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:43:52.883672 2561 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 24 16:43:52.883737 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:43:52.883733 2561 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/cf66743c-a822-49b5-8ccd-3ed04c6eda8e-cert podName:cf66743c-a822-49b5-8ccd-3ed04c6eda8e nodeName:}" failed. No retries permitted until 2026-04-24 16:43:53.38371498 +0000 UTC m=+302.236585687 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf66743c-a822-49b5-8ccd-3ed04c6eda8e-cert") pod "kserve-controller-manager-7f7fb4c66f-pmhk2" (UID: "cf66743c-a822-49b5-8ccd-3ed04c6eda8e") : secret "kserve-webhook-server-cert" not found Apr 24 16:43:52.883832 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.883812 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scndg\" (UniqueName: \"kubernetes.io/projected/8f7987de-e1ca-4187-9ea5-6347c18b87b5-kube-api-access-scndg\") pod \"seaweedfs-86cc847c5c-wxq7p\" (UID: \"8f7987de-e1ca-4187-9ea5-6347c18b87b5\") " pod="kserve/seaweedfs-86cc847c5c-wxq7p" Apr 24 16:43:52.884004 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.883982 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8f7987de-e1ca-4187-9ea5-6347c18b87b5-data\") pod \"seaweedfs-86cc847c5c-wxq7p\" (UID: \"8f7987de-e1ca-4187-9ea5-6347c18b87b5\") " pod="kserve/seaweedfs-86cc847c5c-wxq7p" Apr 24 16:43:52.892884 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.892859 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqthx\" (UniqueName: \"kubernetes.io/projected/cf66743c-a822-49b5-8ccd-3ed04c6eda8e-kube-api-access-xqthx\") pod \"kserve-controller-manager-7f7fb4c66f-pmhk2\" (UID: \"cf66743c-a822-49b5-8ccd-3ed04c6eda8e\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" Apr 24 16:43:52.892971 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:52.892955 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scndg\" (UniqueName: 
\"kubernetes.io/projected/8f7987de-e1ca-4187-9ea5-6347c18b87b5-kube-api-access-scndg\") pod \"seaweedfs-86cc847c5c-wxq7p\" (UID: \"8f7987de-e1ca-4187-9ea5-6347c18b87b5\") " pod="kserve/seaweedfs-86cc847c5c-wxq7p" Apr 24 16:43:53.076929 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:53.076850 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-wxq7p" Apr 24 16:43:53.194733 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:53.194698 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-wxq7p"] Apr 24 16:43:53.197736 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:43:53.197705 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f7987de_e1ca_4187_9ea5_6347c18b87b5.slice/crio-bb5bda6debfd02ba730411e01f457c294de6d9da7d55468dc545dbbb921e1887 WatchSource:0}: Error finding container bb5bda6debfd02ba730411e01f457c294de6d9da7d55468dc545dbbb921e1887: Status 404 returned error can't find the container with id bb5bda6debfd02ba730411e01f457c294de6d9da7d55468dc545dbbb921e1887 Apr 24 16:43:53.198867 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:53.198850 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:43:53.387513 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:53.387426 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf66743c-a822-49b5-8ccd-3ed04c6eda8e-cert\") pod \"kserve-controller-manager-7f7fb4c66f-pmhk2\" (UID: \"cf66743c-a822-49b5-8ccd-3ed04c6eda8e\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" Apr 24 16:43:53.390109 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:53.390090 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf66743c-a822-49b5-8ccd-3ed04c6eda8e-cert\") pod 
\"kserve-controller-manager-7f7fb4c66f-pmhk2\" (UID: \"cf66743c-a822-49b5-8ccd-3ed04c6eda8e\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" Apr 24 16:43:53.636913 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:53.636617 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" Apr 24 16:43:53.640727 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:53.640655 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-wxq7p" event={"ID":"8f7987de-e1ca-4187-9ea5-6347c18b87b5","Type":"ContainerStarted","Data":"bb5bda6debfd02ba730411e01f457c294de6d9da7d55468dc545dbbb921e1887"} Apr 24 16:43:53.765539 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:53.765505 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-pmhk2"] Apr 24 16:43:53.805036 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:43:53.804989 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf66743c_a822_49b5_8ccd_3ed04c6eda8e.slice/crio-ed8938df1838ebf73cb077f41ddd32c997b3c4ab1c57f59ce64559abef4bb729 WatchSource:0}: Error finding container ed8938df1838ebf73cb077f41ddd32c997b3c4ab1c57f59ce64559abef4bb729: Status 404 returned error can't find the container with id ed8938df1838ebf73cb077f41ddd32c997b3c4ab1c57f59ce64559abef4bb729 Apr 24 16:43:54.652782 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:54.652740 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" event={"ID":"cf66743c-a822-49b5-8ccd-3ed04c6eda8e","Type":"ContainerStarted","Data":"ed8938df1838ebf73cb077f41ddd32c997b3c4ab1c57f59ce64559abef4bb729"} Apr 24 16:43:57.662805 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:57.662715 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-wxq7p" 
event={"ID":"8f7987de-e1ca-4187-9ea5-6347c18b87b5","Type":"ContainerStarted","Data":"76ae26f95c9ae67d1331f9415b4fa17345237db1237c55a220ed670cf454e984"} Apr 24 16:43:57.663415 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:57.662805 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-wxq7p" Apr 24 16:43:57.664038 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:57.664017 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" event={"ID":"cf66743c-a822-49b5-8ccd-3ed04c6eda8e","Type":"ContainerStarted","Data":"bd2a61d4abb5e5339de91bb1d9e03051c7cad63144792aaee66916f403e46b83"} Apr 24 16:43:57.664177 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:57.664164 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" Apr 24 16:43:57.679375 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:57.679326 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-wxq7p" podStartSLOduration=1.4958727760000001 podStartE2EDuration="5.679313587s" podCreationTimestamp="2026-04-24 16:43:52 +0000 UTC" firstStartedPulling="2026-04-24 16:43:53.198998164 +0000 UTC m=+302.051868855" lastFinishedPulling="2026-04-24 16:43:57.382438972 +0000 UTC m=+306.235309666" observedRunningTime="2026-04-24 16:43:57.678415046 +0000 UTC m=+306.531285759" watchObservedRunningTime="2026-04-24 16:43:57.679313587 +0000 UTC m=+306.532184329" Apr 24 16:43:57.695236 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:43:57.695179 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" podStartSLOduration=2.173233322 podStartE2EDuration="5.695164735s" podCreationTimestamp="2026-04-24 16:43:52 +0000 UTC" firstStartedPulling="2026-04-24 16:43:53.806583508 +0000 UTC m=+302.659454203" lastFinishedPulling="2026-04-24 
16:43:57.32851492 +0000 UTC m=+306.181385616" observedRunningTime="2026-04-24 16:43:57.694481799 +0000 UTC m=+306.547352513" watchObservedRunningTime="2026-04-24 16:43:57.695164735 +0000 UTC m=+306.548035450" Apr 24 16:44:03.668931 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:03.668902 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-wxq7p" Apr 24 16:44:28.201919 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.201887 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-pmhk2"] Apr 24 16:44:28.202381 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.202142 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" podUID="cf66743c-a822-49b5-8ccd-3ed04c6eda8e" containerName="manager" containerID="cri-o://bd2a61d4abb5e5339de91bb1d9e03051c7cad63144792aaee66916f403e46b83" gracePeriod=10 Apr 24 16:44:28.207152 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.207127 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" Apr 24 16:44:28.228208 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.228185 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-gczs8"] Apr 24 16:44:28.231279 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.231264 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-gczs8" Apr 24 16:44:28.238740 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.238718 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-gczs8"] Apr 24 16:44:28.338449 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.338419 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcbm2\" (UniqueName: \"kubernetes.io/projected/d1cf6eeb-9ea9-470c-852e-c4f262ddbca3-kube-api-access-zcbm2\") pod \"kserve-controller-manager-7f7fb4c66f-gczs8\" (UID: \"d1cf6eeb-9ea9-470c-852e-c4f262ddbca3\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-gczs8" Apr 24 16:44:28.338562 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.338474 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1cf6eeb-9ea9-470c-852e-c4f262ddbca3-cert\") pod \"kserve-controller-manager-7f7fb4c66f-gczs8\" (UID: \"d1cf6eeb-9ea9-470c-852e-c4f262ddbca3\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-gczs8" Apr 24 16:44:28.438939 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.438910 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1cf6eeb-9ea9-470c-852e-c4f262ddbca3-cert\") pod \"kserve-controller-manager-7f7fb4c66f-gczs8\" (UID: \"d1cf6eeb-9ea9-470c-852e-c4f262ddbca3\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-gczs8" Apr 24 16:44:28.439072 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.438989 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcbm2\" (UniqueName: \"kubernetes.io/projected/d1cf6eeb-9ea9-470c-852e-c4f262ddbca3-kube-api-access-zcbm2\") pod \"kserve-controller-manager-7f7fb4c66f-gczs8\" (UID: \"d1cf6eeb-9ea9-470c-852e-c4f262ddbca3\") " 
pod="kserve/kserve-controller-manager-7f7fb4c66f-gczs8" Apr 24 16:44:28.441288 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.441269 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" Apr 24 16:44:28.441384 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.441319 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1cf6eeb-9ea9-470c-852e-c4f262ddbca3-cert\") pod \"kserve-controller-manager-7f7fb4c66f-gczs8\" (UID: \"d1cf6eeb-9ea9-470c-852e-c4f262ddbca3\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-gczs8" Apr 24 16:44:28.447275 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.447255 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcbm2\" (UniqueName: \"kubernetes.io/projected/d1cf6eeb-9ea9-470c-852e-c4f262ddbca3-kube-api-access-zcbm2\") pod \"kserve-controller-manager-7f7fb4c66f-gczs8\" (UID: \"d1cf6eeb-9ea9-470c-852e-c4f262ddbca3\") " pod="kserve/kserve-controller-manager-7f7fb4c66f-gczs8" Apr 24 16:44:28.539607 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.539537 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf66743c-a822-49b5-8ccd-3ed04c6eda8e-cert\") pod \"cf66743c-a822-49b5-8ccd-3ed04c6eda8e\" (UID: \"cf66743c-a822-49b5-8ccd-3ed04c6eda8e\") " Apr 24 16:44:28.539723 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.539611 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqthx\" (UniqueName: \"kubernetes.io/projected/cf66743c-a822-49b5-8ccd-3ed04c6eda8e-kube-api-access-xqthx\") pod \"cf66743c-a822-49b5-8ccd-3ed04c6eda8e\" (UID: \"cf66743c-a822-49b5-8ccd-3ed04c6eda8e\") " Apr 24 16:44:28.541640 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.541609 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/cf66743c-a822-49b5-8ccd-3ed04c6eda8e-kube-api-access-xqthx" (OuterVolumeSpecName: "kube-api-access-xqthx") pod "cf66743c-a822-49b5-8ccd-3ed04c6eda8e" (UID: "cf66743c-a822-49b5-8ccd-3ed04c6eda8e"). InnerVolumeSpecName "kube-api-access-xqthx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:44:28.541740 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.541643 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf66743c-a822-49b5-8ccd-3ed04c6eda8e-cert" (OuterVolumeSpecName: "cert") pod "cf66743c-a822-49b5-8ccd-3ed04c6eda8e" (UID: "cf66743c-a822-49b5-8ccd-3ed04c6eda8e"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:44:28.587837 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.587816 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-gczs8" Apr 24 16:44:28.641039 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.641001 2561 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf66743c-a822-49b5-8ccd-3ed04c6eda8e-cert\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:44:28.641202 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.641041 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xqthx\" (UniqueName: \"kubernetes.io/projected/cf66743c-a822-49b5-8ccd-3ed04c6eda8e-kube-api-access-xqthx\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:44:28.701623 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.701572 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-gczs8"] Apr 24 16:44:28.704064 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:44:28.704039 2561 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1cf6eeb_9ea9_470c_852e_c4f262ddbca3.slice/crio-f1d8dcbdc78c237e54d2ea4d3b982f7754c7ec06846ad2272eae81a8e69eb5c9 WatchSource:0}: Error finding container f1d8dcbdc78c237e54d2ea4d3b982f7754c7ec06846ad2272eae81a8e69eb5c9: Status 404 returned error can't find the container with id f1d8dcbdc78c237e54d2ea4d3b982f7754c7ec06846ad2272eae81a8e69eb5c9 Apr 24 16:44:28.749029 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.748999 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-gczs8" event={"ID":"d1cf6eeb-9ea9-470c-852e-c4f262ddbca3","Type":"ContainerStarted","Data":"f1d8dcbdc78c237e54d2ea4d3b982f7754c7ec06846ad2272eae81a8e69eb5c9"} Apr 24 16:44:28.750081 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.750057 2561 generic.go:358] "Generic (PLEG): container finished" podID="cf66743c-a822-49b5-8ccd-3ed04c6eda8e" containerID="bd2a61d4abb5e5339de91bb1d9e03051c7cad63144792aaee66916f403e46b83" exitCode=0 Apr 24 16:44:28.750181 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.750135 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" event={"ID":"cf66743c-a822-49b5-8ccd-3ed04c6eda8e","Type":"ContainerDied","Data":"bd2a61d4abb5e5339de91bb1d9e03051c7cad63144792aaee66916f403e46b83"} Apr 24 16:44:28.750181 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.750145 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" Apr 24 16:44:28.750181 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.750165 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-pmhk2" event={"ID":"cf66743c-a822-49b5-8ccd-3ed04c6eda8e","Type":"ContainerDied","Data":"ed8938df1838ebf73cb077f41ddd32c997b3c4ab1c57f59ce64559abef4bb729"} Apr 24 16:44:28.750290 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.750185 2561 scope.go:117] "RemoveContainer" containerID="bd2a61d4abb5e5339de91bb1d9e03051c7cad63144792aaee66916f403e46b83" Apr 24 16:44:28.757685 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.757668 2561 scope.go:117] "RemoveContainer" containerID="bd2a61d4abb5e5339de91bb1d9e03051c7cad63144792aaee66916f403e46b83" Apr 24 16:44:28.757937 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:44:28.757920 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd2a61d4abb5e5339de91bb1d9e03051c7cad63144792aaee66916f403e46b83\": container with ID starting with bd2a61d4abb5e5339de91bb1d9e03051c7cad63144792aaee66916f403e46b83 not found: ID does not exist" containerID="bd2a61d4abb5e5339de91bb1d9e03051c7cad63144792aaee66916f403e46b83" Apr 24 16:44:28.758008 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.757945 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd2a61d4abb5e5339de91bb1d9e03051c7cad63144792aaee66916f403e46b83"} err="failed to get container status \"bd2a61d4abb5e5339de91bb1d9e03051c7cad63144792aaee66916f403e46b83\": rpc error: code = NotFound desc = could not find container \"bd2a61d4abb5e5339de91bb1d9e03051c7cad63144792aaee66916f403e46b83\": container with ID starting with bd2a61d4abb5e5339de91bb1d9e03051c7cad63144792aaee66916f403e46b83 not found: ID does not exist" Apr 24 16:44:28.772159 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.772138 2561 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-pmhk2"] Apr 24 16:44:28.777873 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:28.777854 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7f7fb4c66f-pmhk2"] Apr 24 16:44:29.651190 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:29.651158 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf66743c-a822-49b5-8ccd-3ed04c6eda8e" path="/var/lib/kubelet/pods/cf66743c-a822-49b5-8ccd-3ed04c6eda8e/volumes" Apr 24 16:44:29.754167 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:29.754129 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7f7fb4c66f-gczs8" event={"ID":"d1cf6eeb-9ea9-470c-852e-c4f262ddbca3","Type":"ContainerStarted","Data":"1a19388170f8f34a18e12e92baae09b4683fa80a50437fad4af27cb47da90472"} Apr 24 16:44:29.754336 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:29.754264 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7f7fb4c66f-gczs8" Apr 24 16:44:29.773249 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:44:29.773202 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7f7fb4c66f-gczs8" podStartSLOduration=1.486561821 podStartE2EDuration="1.773187998s" podCreationTimestamp="2026-04-24 16:44:28 +0000 UTC" firstStartedPulling="2026-04-24 16:44:28.705255195 +0000 UTC m=+337.558125887" lastFinishedPulling="2026-04-24 16:44:28.991881367 +0000 UTC m=+337.844752064" observedRunningTime="2026-04-24 16:44:29.771323916 +0000 UTC m=+338.624194630" watchObservedRunningTime="2026-04-24 16:44:29.773187998 +0000 UTC m=+338.626058711" Apr 24 16:45:00.762730 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:00.762697 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7f7fb4c66f-gczs8" Apr 24 
16:45:01.601756 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.601721 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-879lh"] Apr 24 16:45:01.602225 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.602210 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf66743c-a822-49b5-8ccd-3ed04c6eda8e" containerName="manager" Apr 24 16:45:01.602273 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.602229 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf66743c-a822-49b5-8ccd-3ed04c6eda8e" containerName="manager" Apr 24 16:45:01.602307 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.602302 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf66743c-a822-49b5-8ccd-3ed04c6eda8e" containerName="manager" Apr 24 16:45:01.605367 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.605350 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-879lh" Apr 24 16:45:01.608059 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.608036 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-d5rrx\"" Apr 24 16:45:01.608190 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.608084 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 24 16:45:01.615201 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.615181 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-879lh"] Apr 24 16:45:01.622235 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.622212 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-vfd2q"] Apr 24 16:45:01.625215 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.625196 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-vfd2q" Apr 24 16:45:01.628022 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.627999 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 24 16:45:01.628170 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.628057 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-q4nlb\"" Apr 24 16:45:01.635528 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.635508 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-vfd2q"] Apr 24 16:45:01.689820 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.689798 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45d6ccd2-4d56-43df-b37c-538044e628e0-tls-certs\") pod \"model-serving-api-86f7b4b499-879lh\" (UID: \"45d6ccd2-4d56-43df-b37c-538044e628e0\") " pod="kserve/model-serving-api-86f7b4b499-879lh" Apr 24 16:45:01.689967 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.689866 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcrg9\" (UniqueName: \"kubernetes.io/projected/45d6ccd2-4d56-43df-b37c-538044e628e0-kube-api-access-fcrg9\") pod \"model-serving-api-86f7b4b499-879lh\" (UID: \"45d6ccd2-4d56-43df-b37c-538044e628e0\") " pod="kserve/model-serving-api-86f7b4b499-879lh" Apr 24 16:45:01.790960 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.790917 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkf9k\" (UniqueName: \"kubernetes.io/projected/6617c0d5-1e92-46e7-ba6f-65a16498f29b-kube-api-access-mkf9k\") pod \"odh-model-controller-696fc77849-vfd2q\" (UID: \"6617c0d5-1e92-46e7-ba6f-65a16498f29b\") " 
pod="kserve/odh-model-controller-696fc77849-vfd2q" Apr 24 16:45:01.791398 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.790976 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcrg9\" (UniqueName: \"kubernetes.io/projected/45d6ccd2-4d56-43df-b37c-538044e628e0-kube-api-access-fcrg9\") pod \"model-serving-api-86f7b4b499-879lh\" (UID: \"45d6ccd2-4d56-43df-b37c-538044e628e0\") " pod="kserve/model-serving-api-86f7b4b499-879lh" Apr 24 16:45:01.791398 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.791003 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6617c0d5-1e92-46e7-ba6f-65a16498f29b-cert\") pod \"odh-model-controller-696fc77849-vfd2q\" (UID: \"6617c0d5-1e92-46e7-ba6f-65a16498f29b\") " pod="kserve/odh-model-controller-696fc77849-vfd2q" Apr 24 16:45:01.791398 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.791054 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45d6ccd2-4d56-43df-b37c-538044e628e0-tls-certs\") pod \"model-serving-api-86f7b4b499-879lh\" (UID: \"45d6ccd2-4d56-43df-b37c-538044e628e0\") " pod="kserve/model-serving-api-86f7b4b499-879lh" Apr 24 16:45:01.791398 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:45:01.791178 2561 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 24 16:45:01.791398 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:45:01.791241 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45d6ccd2-4d56-43df-b37c-538044e628e0-tls-certs podName:45d6ccd2-4d56-43df-b37c-538044e628e0 nodeName:}" failed. No retries permitted until 2026-04-24 16:45:02.291223296 +0000 UTC m=+371.144093988 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/45d6ccd2-4d56-43df-b37c-538044e628e0-tls-certs") pod "model-serving-api-86f7b4b499-879lh" (UID: "45d6ccd2-4d56-43df-b37c-538044e628e0") : secret "model-serving-api-tls" not found Apr 24 16:45:01.805203 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.805169 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcrg9\" (UniqueName: \"kubernetes.io/projected/45d6ccd2-4d56-43df-b37c-538044e628e0-kube-api-access-fcrg9\") pod \"model-serving-api-86f7b4b499-879lh\" (UID: \"45d6ccd2-4d56-43df-b37c-538044e628e0\") " pod="kserve/model-serving-api-86f7b4b499-879lh" Apr 24 16:45:01.892587 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.892484 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkf9k\" (UniqueName: \"kubernetes.io/projected/6617c0d5-1e92-46e7-ba6f-65a16498f29b-kube-api-access-mkf9k\") pod \"odh-model-controller-696fc77849-vfd2q\" (UID: \"6617c0d5-1e92-46e7-ba6f-65a16498f29b\") " pod="kserve/odh-model-controller-696fc77849-vfd2q" Apr 24 16:45:01.892587 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.892527 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6617c0d5-1e92-46e7-ba6f-65a16498f29b-cert\") pod \"odh-model-controller-696fc77849-vfd2q\" (UID: \"6617c0d5-1e92-46e7-ba6f-65a16498f29b\") " pod="kserve/odh-model-controller-696fc77849-vfd2q" Apr 24 16:45:01.894888 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.894868 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6617c0d5-1e92-46e7-ba6f-65a16498f29b-cert\") pod \"odh-model-controller-696fc77849-vfd2q\" (UID: \"6617c0d5-1e92-46e7-ba6f-65a16498f29b\") " pod="kserve/odh-model-controller-696fc77849-vfd2q" Apr 24 16:45:01.900389 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.900366 
2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkf9k\" (UniqueName: \"kubernetes.io/projected/6617c0d5-1e92-46e7-ba6f-65a16498f29b-kube-api-access-mkf9k\") pod \"odh-model-controller-696fc77849-vfd2q\" (UID: \"6617c0d5-1e92-46e7-ba6f-65a16498f29b\") " pod="kserve/odh-model-controller-696fc77849-vfd2q" Apr 24 16:45:01.937517 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:01.937484 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-vfd2q" Apr 24 16:45:02.056604 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:02.056576 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-vfd2q"] Apr 24 16:45:02.059045 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:45:02.059002 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6617c0d5_1e92_46e7_ba6f_65a16498f29b.slice/crio-2d1be71d3a6ff3a0f416194166590e88b0b012f874f14452a4a8cc7aa8a7b20b WatchSource:0}: Error finding container 2d1be71d3a6ff3a0f416194166590e88b0b012f874f14452a4a8cc7aa8a7b20b: Status 404 returned error can't find the container with id 2d1be71d3a6ff3a0f416194166590e88b0b012f874f14452a4a8cc7aa8a7b20b Apr 24 16:45:02.295990 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:02.295955 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45d6ccd2-4d56-43df-b37c-538044e628e0-tls-certs\") pod \"model-serving-api-86f7b4b499-879lh\" (UID: \"45d6ccd2-4d56-43df-b37c-538044e628e0\") " pod="kserve/model-serving-api-86f7b4b499-879lh" Apr 24 16:45:02.298307 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:02.298288 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45d6ccd2-4d56-43df-b37c-538044e628e0-tls-certs\") pod \"model-serving-api-86f7b4b499-879lh\" 
(UID: \"45d6ccd2-4d56-43df-b37c-538044e628e0\") " pod="kserve/model-serving-api-86f7b4b499-879lh" Apr 24 16:45:02.517535 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:02.517498 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-879lh" Apr 24 16:45:02.633632 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:02.633611 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-879lh"] Apr 24 16:45:02.635146 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:45:02.635109 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45d6ccd2_4d56_43df_b37c_538044e628e0.slice/crio-17686772f8ac3de342ea2c150952bcab9d14e8d2e56f86faecf9e15c54649c94 WatchSource:0}: Error finding container 17686772f8ac3de342ea2c150952bcab9d14e8d2e56f86faecf9e15c54649c94: Status 404 returned error can't find the container with id 17686772f8ac3de342ea2c150952bcab9d14e8d2e56f86faecf9e15c54649c94 Apr 24 16:45:02.851280 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:02.851193 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-vfd2q" event={"ID":"6617c0d5-1e92-46e7-ba6f-65a16498f29b","Type":"ContainerStarted","Data":"2d1be71d3a6ff3a0f416194166590e88b0b012f874f14452a4a8cc7aa8a7b20b"} Apr 24 16:45:02.852278 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:02.852254 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-879lh" event={"ID":"45d6ccd2-4d56-43df-b37c-538044e628e0","Type":"ContainerStarted","Data":"17686772f8ac3de342ea2c150952bcab9d14e8d2e56f86faecf9e15c54649c94"} Apr 24 16:45:05.863073 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:05.863041 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-vfd2q" 
event={"ID":"6617c0d5-1e92-46e7-ba6f-65a16498f29b","Type":"ContainerStarted","Data":"c624a000ba483323c6f30ad35dd1eb667dae18c6a67c8aed36edc12322b0f145"} Apr 24 16:45:05.863073 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:05.863085 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-vfd2q" Apr 24 16:45:05.864404 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:05.864379 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-879lh" event={"ID":"45d6ccd2-4d56-43df-b37c-538044e628e0","Type":"ContainerStarted","Data":"9a50fff72fd5ec0be875b9d6fd3b95b9e99f7e894d66ea5657f21e227798a41f"} Apr 24 16:45:05.864514 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:05.864468 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-879lh" Apr 24 16:45:05.879866 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:05.879827 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-vfd2q" podStartSLOduration=1.5537551939999998 podStartE2EDuration="4.87981401s" podCreationTimestamp="2026-04-24 16:45:01 +0000 UTC" firstStartedPulling="2026-04-24 16:45:02.060550089 +0000 UTC m=+370.913420781" lastFinishedPulling="2026-04-24 16:45:05.386608903 +0000 UTC m=+374.239479597" observedRunningTime="2026-04-24 16:45:05.879006041 +0000 UTC m=+374.731876756" watchObservedRunningTime="2026-04-24 16:45:05.87981401 +0000 UTC m=+374.732684723" Apr 24 16:45:05.895581 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:05.895540 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-879lh" podStartSLOduration=2.14883222 podStartE2EDuration="4.895528288s" podCreationTimestamp="2026-04-24 16:45:01 +0000 UTC" firstStartedPulling="2026-04-24 16:45:02.636873193 +0000 UTC m=+371.489743885" lastFinishedPulling="2026-04-24 
16:45:05.383569251 +0000 UTC m=+374.236439953" observedRunningTime="2026-04-24 16:45:05.894108384 +0000 UTC m=+374.746979098" watchObservedRunningTime="2026-04-24 16:45:05.895528288 +0000 UTC m=+374.748399001" Apr 24 16:45:16.869803 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:16.869772 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-vfd2q" Apr 24 16:45:16.871475 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:16.871455 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-879lh" Apr 24 16:45:37.693551 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.693515 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8"] Apr 24 16:45:37.700830 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.700807 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" Apr 24 16:45:37.703447 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.703423 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 16:45:37.703676 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.703662 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-32209-predictor-serving-cert\"" Apr 24 16:45:37.704528 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.704504 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-32209-kube-rbac-proxy-sar-config\"" Apr 24 16:45:37.704528 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.704534 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gd6p9\"" Apr 24 16:45:37.704670 ip-10-0-128-44 
kubenswrapper[2561]: I0424 16:45:37.704547 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 16:45:37.707957 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.707936 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8"] Apr 24 16:45:37.782509 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.782473 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b01fef6-6066-43b8-a153-e14d7d125268-proxy-tls\") pod \"success-200-isvc-32209-predictor-649bfcd444-vqfb8\" (UID: \"9b01fef6-6066-43b8-a153-e14d7d125268\") " pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" Apr 24 16:45:37.782665 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.782536 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpr5j\" (UniqueName: \"kubernetes.io/projected/9b01fef6-6066-43b8-a153-e14d7d125268-kube-api-access-bpr5j\") pod \"success-200-isvc-32209-predictor-649bfcd444-vqfb8\" (UID: \"9b01fef6-6066-43b8-a153-e14d7d125268\") " pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" Apr 24 16:45:37.782665 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.782584 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-32209-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b01fef6-6066-43b8-a153-e14d7d125268-success-200-isvc-32209-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-32209-predictor-649bfcd444-vqfb8\" (UID: \"9b01fef6-6066-43b8-a153-e14d7d125268\") " pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" Apr 24 16:45:37.883045 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.883007 2561 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-32209-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b01fef6-6066-43b8-a153-e14d7d125268-success-200-isvc-32209-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-32209-predictor-649bfcd444-vqfb8\" (UID: \"9b01fef6-6066-43b8-a153-e14d7d125268\") " pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" Apr 24 16:45:37.883218 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.883086 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b01fef6-6066-43b8-a153-e14d7d125268-proxy-tls\") pod \"success-200-isvc-32209-predictor-649bfcd444-vqfb8\" (UID: \"9b01fef6-6066-43b8-a153-e14d7d125268\") " pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" Apr 24 16:45:37.883218 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.883164 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpr5j\" (UniqueName: \"kubernetes.io/projected/9b01fef6-6066-43b8-a153-e14d7d125268-kube-api-access-bpr5j\") pod \"success-200-isvc-32209-predictor-649bfcd444-vqfb8\" (UID: \"9b01fef6-6066-43b8-a153-e14d7d125268\") " pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" Apr 24 16:45:37.883331 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:45:37.883243 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-32209-predictor-serving-cert: secret "success-200-isvc-32209-predictor-serving-cert" not found Apr 24 16:45:37.883331 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:45:37.883327 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b01fef6-6066-43b8-a153-e14d7d125268-proxy-tls podName:9b01fef6-6066-43b8-a153-e14d7d125268 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:45:38.38330302 +0000 UTC m=+407.236173718 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9b01fef6-6066-43b8-a153-e14d7d125268-proxy-tls") pod "success-200-isvc-32209-predictor-649bfcd444-vqfb8" (UID: "9b01fef6-6066-43b8-a153-e14d7d125268") : secret "success-200-isvc-32209-predictor-serving-cert" not found Apr 24 16:45:37.883792 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.883769 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-32209-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b01fef6-6066-43b8-a153-e14d7d125268-success-200-isvc-32209-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-32209-predictor-649bfcd444-vqfb8\" (UID: \"9b01fef6-6066-43b8-a153-e14d7d125268\") " pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" Apr 24 16:45:37.898130 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.898087 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpr5j\" (UniqueName: \"kubernetes.io/projected/9b01fef6-6066-43b8-a153-e14d7d125268-kube-api-access-bpr5j\") pod \"success-200-isvc-32209-predictor-649bfcd444-vqfb8\" (UID: \"9b01fef6-6066-43b8-a153-e14d7d125268\") " pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" Apr 24 16:45:37.990312 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:37.990230 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"] Apr 24 16:45:38.028246 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.028222 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"] Apr 24 16:45:38.028442 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.028375 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" Apr 24 16:45:38.031257 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.031237 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-32209-predictor-serving-cert\"" Apr 24 16:45:38.031373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.031237 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-32209-kube-rbac-proxy-sar-config\"" Apr 24 16:45:38.085921 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.085896 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a5f72e4-d8de-42d7-9078-2fa28a50203d-proxy-tls\") pod \"error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5\" (UID: \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\") " pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" Apr 24 16:45:38.086182 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.086162 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtzgj\" (UniqueName: \"kubernetes.io/projected/3a5f72e4-d8de-42d7-9078-2fa28a50203d-kube-api-access-mtzgj\") pod \"error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5\" (UID: \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\") " pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" Apr 24 16:45:38.086370 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.086338 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-32209-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3a5f72e4-d8de-42d7-9078-2fa28a50203d-error-404-isvc-32209-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5\" (UID: \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\") " 
pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"
Apr 24 16:45:38.187277 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.187246 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a5f72e4-d8de-42d7-9078-2fa28a50203d-proxy-tls\") pod \"error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5\" (UID: \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\") " pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"
Apr 24 16:45:38.187447 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.187299 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtzgj\" (UniqueName: \"kubernetes.io/projected/3a5f72e4-d8de-42d7-9078-2fa28a50203d-kube-api-access-mtzgj\") pod \"error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5\" (UID: \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\") " pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"
Apr 24 16:45:38.187447 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.187334 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-32209-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3a5f72e4-d8de-42d7-9078-2fa28a50203d-error-404-isvc-32209-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5\" (UID: \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\") " pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"
Apr 24 16:45:38.188089 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:45:38.188070 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-32209-predictor-serving-cert: secret "error-404-isvc-32209-predictor-serving-cert" not found
Apr 24 16:45:38.188161 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.188100 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-32209-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/3a5f72e4-d8de-42d7-9078-2fa28a50203d-error-404-isvc-32209-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5\" (UID: \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\") " pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"
Apr 24 16:45:38.188223 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:45:38.188166 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a5f72e4-d8de-42d7-9078-2fa28a50203d-proxy-tls podName:3a5f72e4-d8de-42d7-9078-2fa28a50203d nodeName:}" failed. No retries permitted until 2026-04-24 16:45:38.688147079 +0000 UTC m=+407.541017777 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/3a5f72e4-d8de-42d7-9078-2fa28a50203d-proxy-tls") pod "error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" (UID: "3a5f72e4-d8de-42d7-9078-2fa28a50203d") : secret "error-404-isvc-32209-predictor-serving-cert" not found
Apr 24 16:45:38.197381 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.197354 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtzgj\" (UniqueName: \"kubernetes.io/projected/3a5f72e4-d8de-42d7-9078-2fa28a50203d-kube-api-access-mtzgj\") pod \"error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5\" (UID: \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\") " pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"
Apr 24 16:45:38.388916 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.388833 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b01fef6-6066-43b8-a153-e14d7d125268-proxy-tls\") pod \"success-200-isvc-32209-predictor-649bfcd444-vqfb8\" (UID: \"9b01fef6-6066-43b8-a153-e14d7d125268\") " pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8"
Apr 24 16:45:38.391069 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.391044 2561 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b01fef6-6066-43b8-a153-e14d7d125268-proxy-tls\") pod \"success-200-isvc-32209-predictor-649bfcd444-vqfb8\" (UID: \"9b01fef6-6066-43b8-a153-e14d7d125268\") " pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8"
Apr 24 16:45:38.612696 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.612662 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8"
Apr 24 16:45:38.691210 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.691182 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a5f72e4-d8de-42d7-9078-2fa28a50203d-proxy-tls\") pod \"error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5\" (UID: \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\") " pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"
Apr 24 16:45:38.694133 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.694084 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a5f72e4-d8de-42d7-9078-2fa28a50203d-proxy-tls\") pod \"error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5\" (UID: \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\") " pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"
Apr 24 16:45:38.738535 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.738450 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8"]
Apr 24 16:45:38.740789 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:45:38.740760 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b01fef6_6066_43b8_a153_e14d7d125268.slice/crio-e59c06c20da534feeb825cc8dd426cea432b79b0c2ff890702ec124c88f59f4e 
WatchSource:0}: Error finding container e59c06c20da534feeb825cc8dd426cea432b79b0c2ff890702ec124c88f59f4e: Status 404 returned error can't find the container with id e59c06c20da534feeb825cc8dd426cea432b79b0c2ff890702ec124c88f59f4e
Apr 24 16:45:38.943962 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.943932 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"
Apr 24 16:45:38.972657 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:38.972627 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" event={"ID":"9b01fef6-6066-43b8-a153-e14d7d125268","Type":"ContainerStarted","Data":"e59c06c20da534feeb825cc8dd426cea432b79b0c2ff890702ec124c88f59f4e"}
Apr 24 16:45:39.081671 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:39.081648 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"]
Apr 24 16:45:39.083760 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:45:39.083725 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a5f72e4_d8de_42d7_9078_2fa28a50203d.slice/crio-d8e2309920d3b0867047951b40707662a11620a121e51cfed3ccdf1038be6379 WatchSource:0}: Error finding container d8e2309920d3b0867047951b40707662a11620a121e51cfed3ccdf1038be6379: Status 404 returned error can't find the container with id d8e2309920d3b0867047951b40707662a11620a121e51cfed3ccdf1038be6379
Apr 24 16:45:40.001726 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:40.001153 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" event={"ID":"3a5f72e4-d8de-42d7-9078-2fa28a50203d","Type":"ContainerStarted","Data":"d8e2309920d3b0867047951b40707662a11620a121e51cfed3ccdf1038be6379"}
Apr 24 16:45:55.084428 ip-10-0-128-44 
kubenswrapper[2561]: I0424 16:45:55.084385 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" event={"ID":"9b01fef6-6066-43b8-a153-e14d7d125268","Type":"ContainerStarted","Data":"3d82def966813d1ccb3c7f1a3cbc1b88a1b7149093cfcb92cb021116598640aa"}
Apr 24 16:45:55.086040 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:55.086012 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" event={"ID":"3a5f72e4-d8de-42d7-9078-2fa28a50203d","Type":"ContainerStarted","Data":"1b473988aa85e15f65dd76b249f923f3195fe9141fa84a2db0bad1c87578b071"}
Apr 24 16:45:57.094021 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:57.093986 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" event={"ID":"9b01fef6-6066-43b8-a153-e14d7d125268","Type":"ContainerStarted","Data":"82eb39872b2e9d3d1de30425b840e4bd8523b97952308dab67a35393ca09c3f7"}
Apr 24 16:45:57.094493 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:57.094131 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8"
Apr 24 16:45:57.094493 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:57.094227 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8"
Apr 24 16:45:57.095414 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:57.095375 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 24 16:45:57.095717 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:57.095685 2561 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" event={"ID":"3a5f72e4-d8de-42d7-9078-2fa28a50203d","Type":"ContainerStarted","Data":"0515c2acbfc2772445f360c21eccf8812e10602d69c9f401d4498fd678780208"}
Apr 24 16:45:57.095839 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:57.095824 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"
Apr 24 16:45:57.095893 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:57.095847 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"
Apr 24 16:45:57.096743 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:57.096723 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 24 16:45:57.177050 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:57.176992 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" podStartSLOduration=1.9769598 podStartE2EDuration="20.176975394s" podCreationTimestamp="2026-04-24 16:45:37 +0000 UTC" firstStartedPulling="2026-04-24 16:45:38.74246383 +0000 UTC m=+407.595334521" lastFinishedPulling="2026-04-24 16:45:56.942479416 +0000 UTC m=+425.795350115" observedRunningTime="2026-04-24 16:45:57.146932801 +0000 UTC m=+425.999803513" watchObservedRunningTime="2026-04-24 16:45:57.176975394 +0000 UTC m=+426.029846108"
Apr 24 16:45:57.177318 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:57.177287 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" 
podStartSLOduration=2.312902311 podStartE2EDuration="20.177279917s" podCreationTimestamp="2026-04-24 16:45:37 +0000 UTC" firstStartedPulling="2026-04-24 16:45:39.085832345 +0000 UTC m=+407.938703051" lastFinishedPulling="2026-04-24 16:45:56.950209961 +0000 UTC m=+425.803080657" observedRunningTime="2026-04-24 16:45:57.176645674 +0000 UTC m=+426.029516387" watchObservedRunningTime="2026-04-24 16:45:57.177279917 +0000 UTC m=+426.030150630"
Apr 24 16:45:58.099293 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:58.099248 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 24 16:45:58.099695 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:45:58.099433 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 24 16:46:03.104335 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:03.104300 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"
Apr 24 16:46:03.104740 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:03.104508 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8"
Apr 24 16:46:03.104891 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:03.104867 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: 
connect: connection refused"
Apr 24 16:46:03.104957 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:03.104866 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 24 16:46:13.105682 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:13.105596 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 24 16:46:13.106160 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:13.105596 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 24 16:46:22.485355 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:22.485320 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7fdc7876ff-nqcg8"]
Apr 24 16:46:23.105560 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:23.105520 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 24 16:46:23.105717 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:23.105520 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 24 16:46:33.104958 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:33.104914 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused"
Apr 24 16:46:33.105417 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:33.104912 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 24 16:46:43.106284 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:43.106255 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8"
Apr 24 16:46:43.106693 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:43.106377 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"
Apr 24 16:46:47.504081 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.504046 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7fdc7876ff-nqcg8" podUID="9d0e08a6-353b-4ea7-a22c-12a3a3f80932" containerName="console" containerID="cri-o://c6df665e90f247bebef395ddbd0b5eb9cb57507836c6c49d9cf27f50a1ee0a68" gracePeriod=15
Apr 24 16:46:47.748172 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.748152 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7fdc7876ff-nqcg8_9d0e08a6-353b-4ea7-a22c-12a3a3f80932/console/0.log"
Apr 24 16:46:47.748288 ip-10-0-128-44 
kubenswrapper[2561]: I0424 16:46:47.748206 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:46:47.876644 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.876566 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-oauth-serving-cert\") pod \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") "
Apr 24 16:46:47.876644 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.876598 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-config\") pod \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") "
Apr 24 16:46:47.876644 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.876618 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-oauth-config\") pod \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") "
Apr 24 16:46:47.876868 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.876677 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-serving-cert\") pod \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") "
Apr 24 16:46:47.876868 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.876732 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-trusted-ca-bundle\") pod 
\"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") "
Apr 24 16:46:47.876868 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.876753 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-service-ca\") pod \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") "
Apr 24 16:46:47.876868 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.876794 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xv5w\" (UniqueName: \"kubernetes.io/projected/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-kube-api-access-4xv5w\") pod \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\" (UID: \"9d0e08a6-353b-4ea7-a22c-12a3a3f80932\") "
Apr 24 16:46:47.877475 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.877435 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9d0e08a6-353b-4ea7-a22c-12a3a3f80932" (UID: "9d0e08a6-353b-4ea7-a22c-12a3a3f80932"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:46:47.877475 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.877442 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9d0e08a6-353b-4ea7-a22c-12a3a3f80932" (UID: "9d0e08a6-353b-4ea7-a22c-12a3a3f80932"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:46:47.877675 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.877639 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-service-ca" (OuterVolumeSpecName: "service-ca") pod "9d0e08a6-353b-4ea7-a22c-12a3a3f80932" (UID: "9d0e08a6-353b-4ea7-a22c-12a3a3f80932"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:46:47.877748 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.877718 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-config" (OuterVolumeSpecName: "console-config") pod "9d0e08a6-353b-4ea7-a22c-12a3a3f80932" (UID: "9d0e08a6-353b-4ea7-a22c-12a3a3f80932"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:46:47.879792 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.879764 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9d0e08a6-353b-4ea7-a22c-12a3a3f80932" (UID: "9d0e08a6-353b-4ea7-a22c-12a3a3f80932"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:46:47.883959 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.883933 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-kube-api-access-4xv5w" (OuterVolumeSpecName: "kube-api-access-4xv5w") pod "9d0e08a6-353b-4ea7-a22c-12a3a3f80932" (UID: "9d0e08a6-353b-4ea7-a22c-12a3a3f80932"). InnerVolumeSpecName "kube-api-access-4xv5w". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:46:47.884263 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.884238 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9d0e08a6-353b-4ea7-a22c-12a3a3f80932" (UID: "9d0e08a6-353b-4ea7-a22c-12a3a3f80932"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:46:47.978554 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.978528 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4xv5w\" (UniqueName: \"kubernetes.io/projected/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-kube-api-access-4xv5w\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:46:47.978554 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.978549 2561 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-oauth-serving-cert\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:46:47.978671 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.978560 2561 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:46:47.978671 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.978569 2561 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-oauth-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:46:47.978671 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.978585 2561 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-console-serving-cert\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:46:47.978671 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.978594 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-trusted-ca-bundle\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:46:47.978671 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:47.978602 2561 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9d0e08a6-353b-4ea7-a22c-12a3a3f80932-service-ca\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:46:48.255045 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:48.255022 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7fdc7876ff-nqcg8_9d0e08a6-353b-4ea7-a22c-12a3a3f80932/console/0.log"
Apr 24 16:46:48.255235 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:48.255060 2561 generic.go:358] "Generic (PLEG): container finished" podID="9d0e08a6-353b-4ea7-a22c-12a3a3f80932" containerID="c6df665e90f247bebef395ddbd0b5eb9cb57507836c6c49d9cf27f50a1ee0a68" exitCode=2
Apr 24 16:46:48.255235 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:48.255093 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fdc7876ff-nqcg8" event={"ID":"9d0e08a6-353b-4ea7-a22c-12a3a3f80932","Type":"ContainerDied","Data":"c6df665e90f247bebef395ddbd0b5eb9cb57507836c6c49d9cf27f50a1ee0a68"}
Apr 24 16:46:48.255235 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:48.255147 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fdc7876ff-nqcg8" event={"ID":"9d0e08a6-353b-4ea7-a22c-12a3a3f80932","Type":"ContainerDied","Data":"0f1f706ed80bf87cfa0091a562ea02a323ef844267e281232abeb555e4d8bb38"}
Apr 24 16:46:48.255235 ip-10-0-128-44 kubenswrapper[2561]: 
I0424 16:46:48.255164 2561 scope.go:117] "RemoveContainer" containerID="c6df665e90f247bebef395ddbd0b5eb9cb57507836c6c49d9cf27f50a1ee0a68"
Apr 24 16:46:48.255235 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:48.255172 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7fdc7876ff-nqcg8"
Apr 24 16:46:48.263192 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:48.263175 2561 scope.go:117] "RemoveContainer" containerID="c6df665e90f247bebef395ddbd0b5eb9cb57507836c6c49d9cf27f50a1ee0a68"
Apr 24 16:46:48.263441 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:46:48.263423 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6df665e90f247bebef395ddbd0b5eb9cb57507836c6c49d9cf27f50a1ee0a68\": container with ID starting with c6df665e90f247bebef395ddbd0b5eb9cb57507836c6c49d9cf27f50a1ee0a68 not found: ID does not exist" containerID="c6df665e90f247bebef395ddbd0b5eb9cb57507836c6c49d9cf27f50a1ee0a68"
Apr 24 16:46:48.263495 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:48.263451 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6df665e90f247bebef395ddbd0b5eb9cb57507836c6c49d9cf27f50a1ee0a68"} err="failed to get container status \"c6df665e90f247bebef395ddbd0b5eb9cb57507836c6c49d9cf27f50a1ee0a68\": rpc error: code = NotFound desc = could not find container \"c6df665e90f247bebef395ddbd0b5eb9cb57507836c6c49d9cf27f50a1ee0a68\": container with ID starting with c6df665e90f247bebef395ddbd0b5eb9cb57507836c6c49d9cf27f50a1ee0a68 not found: ID does not exist"
Apr 24 16:46:48.276622 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:48.276601 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7fdc7876ff-nqcg8"]
Apr 24 16:46:48.284269 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:48.284248 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-7fdc7876ff-nqcg8"]
Apr 24 16:46:49.651293 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:46:49.651264 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d0e08a6-353b-4ea7-a22c-12a3a3f80932" path="/var/lib/kubelet/pods/9d0e08a6-353b-4ea7-a22c-12a3a3f80932/volumes"
Apr 24 16:47:12.072766 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.072732 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8"]
Apr 24 16:47:12.073143 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.073044 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kserve-container" containerID="cri-o://3d82def966813d1ccb3c7f1a3cbc1b88a1b7149093cfcb92cb021116598640aa" gracePeriod=30
Apr 24 16:47:12.073228 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.073147 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kube-rbac-proxy" containerID="cri-o://82eb39872b2e9d3d1de30425b840e4bd8523b97952308dab67a35393ca09c3f7" gracePeriod=30
Apr 24 16:47:12.147194 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.147165 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"]
Apr 24 16:47:12.147444 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.147422 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerName="kserve-container" containerID="cri-o://1b473988aa85e15f65dd76b249f923f3195fe9141fa84a2db0bad1c87578b071" gracePeriod=30
Apr 24 16:47:12.147505 
ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.147456 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerName="kube-rbac-proxy" containerID="cri-o://0515c2acbfc2772445f360c21eccf8812e10602d69c9f401d4498fd678780208" gracePeriod=30
Apr 24 16:47:12.165036 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.165012 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4"]
Apr 24 16:47:12.165352 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.165339 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d0e08a6-353b-4ea7-a22c-12a3a3f80932" containerName="console"
Apr 24 16:47:12.165352 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.165353 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0e08a6-353b-4ea7-a22c-12a3a3f80932" containerName="console"
Apr 24 16:47:12.165451 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.165416 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d0e08a6-353b-4ea7-a22c-12a3a3f80932" containerName="console"
Apr 24 16:47:12.167342 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.167327 2561 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" Apr 24 16:47:12.170427 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.170408 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-2c4dd-predictor-serving-cert\"" Apr 24 16:47:12.171173 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.171109 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-2c4dd-kube-rbac-proxy-sar-config\"" Apr 24 16:47:12.183391 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.183367 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4"] Apr 24 16:47:12.243318 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.243284 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt"] Apr 24 16:47:12.253869 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.253665 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:47:12.257146 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.256327 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-2c4dd-predictor-serving-cert\"" Apr 24 16:47:12.257146 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.256613 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-2c4dd-kube-rbac-proxy-sar-config\"" Apr 24 16:47:12.257314 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.257276 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt"] Apr 24 16:47:12.270558 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.270535 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn6br\" (UniqueName: \"kubernetes.io/projected/0352dfc6-f89c-4592-899c-58affbababb4-kube-api-access-zn6br\") pod \"success-200-isvc-2c4dd-predictor-587bf956c8-px6z4\" (UID: \"0352dfc6-f89c-4592-899c-58affbababb4\") " pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" Apr 24 16:47:12.270669 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.270584 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0352dfc6-f89c-4592-899c-58affbababb4-proxy-tls\") pod \"success-200-isvc-2c4dd-predictor-587bf956c8-px6z4\" (UID: \"0352dfc6-f89c-4592-899c-58affbababb4\") " pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" Apr 24 16:47:12.270669 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.270625 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-2c4dd-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/0352dfc6-f89c-4592-899c-58affbababb4-success-200-isvc-2c4dd-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-2c4dd-predictor-587bf956c8-px6z4\" (UID: \"0352dfc6-f89c-4592-899c-58affbababb4\") " pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" Apr 24 16:47:12.328897 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.328828 2561 generic.go:358] "Generic (PLEG): container finished" podID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerID="0515c2acbfc2772445f360c21eccf8812e10602d69c9f401d4498fd678780208" exitCode=2 Apr 24 16:47:12.328897 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.328860 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" event={"ID":"3a5f72e4-d8de-42d7-9078-2fa28a50203d","Type":"ContainerDied","Data":"0515c2acbfc2772445f360c21eccf8812e10602d69c9f401d4498fd678780208"} Apr 24 16:47:12.330504 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.330482 2561 generic.go:358] "Generic (PLEG): container finished" podID="9b01fef6-6066-43b8-a153-e14d7d125268" containerID="82eb39872b2e9d3d1de30425b840e4bd8523b97952308dab67a35393ca09c3f7" exitCode=2 Apr 24 16:47:12.330612 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.330511 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" event={"ID":"9b01fef6-6066-43b8-a153-e14d7d125268","Type":"ContainerDied","Data":"82eb39872b2e9d3d1de30425b840e4bd8523b97952308dab67a35393ca09c3f7"} Apr 24 16:47:12.371953 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.371930 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-2c4dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0352dfc6-f89c-4592-899c-58affbababb4-success-200-isvc-2c4dd-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-2c4dd-predictor-587bf956c8-px6z4\" (UID: 
\"0352dfc6-f89c-4592-899c-58affbababb4\") " pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" Apr 24 16:47:12.372092 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.371972 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d22fadcc-27c7-47ce-8cd4-29c8fda89621-proxy-tls\") pod \"error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt\" (UID: \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\") " pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:47:12.372092 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.372010 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-2c4dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d22fadcc-27c7-47ce-8cd4-29c8fda89621-error-404-isvc-2c4dd-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt\" (UID: \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\") " pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:47:12.372237 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.372157 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmhqc\" (UniqueName: \"kubernetes.io/projected/d22fadcc-27c7-47ce-8cd4-29c8fda89621-kube-api-access-pmhqc\") pod \"error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt\" (UID: \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\") " pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:47:12.372237 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.372206 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zn6br\" (UniqueName: \"kubernetes.io/projected/0352dfc6-f89c-4592-899c-58affbababb4-kube-api-access-zn6br\") pod \"success-200-isvc-2c4dd-predictor-587bf956c8-px6z4\" (UID: \"0352dfc6-f89c-4592-899c-58affbababb4\") " 
pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" Apr 24 16:47:12.372327 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.372266 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0352dfc6-f89c-4592-899c-58affbababb4-proxy-tls\") pod \"success-200-isvc-2c4dd-predictor-587bf956c8-px6z4\" (UID: \"0352dfc6-f89c-4592-899c-58affbababb4\") " pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" Apr 24 16:47:12.372645 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.372625 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-2c4dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0352dfc6-f89c-4592-899c-58affbababb4-success-200-isvc-2c4dd-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-2c4dd-predictor-587bf956c8-px6z4\" (UID: \"0352dfc6-f89c-4592-899c-58affbababb4\") " pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" Apr 24 16:47:12.374688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.374665 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0352dfc6-f89c-4592-899c-58affbababb4-proxy-tls\") pod \"success-200-isvc-2c4dd-predictor-587bf956c8-px6z4\" (UID: \"0352dfc6-f89c-4592-899c-58affbababb4\") " pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" Apr 24 16:47:12.392055 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.392025 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn6br\" (UniqueName: \"kubernetes.io/projected/0352dfc6-f89c-4592-899c-58affbababb4-kube-api-access-zn6br\") pod \"success-200-isvc-2c4dd-predictor-587bf956c8-px6z4\" (UID: \"0352dfc6-f89c-4592-899c-58affbababb4\") " pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" Apr 24 16:47:12.472974 ip-10-0-128-44 kubenswrapper[2561]: 
I0424 16:47:12.472945 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d22fadcc-27c7-47ce-8cd4-29c8fda89621-proxy-tls\") pod \"error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt\" (UID: \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\") " pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:47:12.473090 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.472995 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-2c4dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d22fadcc-27c7-47ce-8cd4-29c8fda89621-error-404-isvc-2c4dd-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt\" (UID: \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\") " pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:47:12.473090 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.473043 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmhqc\" (UniqueName: \"kubernetes.io/projected/d22fadcc-27c7-47ce-8cd4-29c8fda89621-kube-api-access-pmhqc\") pod \"error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt\" (UID: \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\") " pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:47:12.473230 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:47:12.473098 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-serving-cert: secret "error-404-isvc-2c4dd-predictor-serving-cert" not found Apr 24 16:47:12.473230 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:47:12.473183 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d22fadcc-27c7-47ce-8cd4-29c8fda89621-proxy-tls podName:d22fadcc-27c7-47ce-8cd4-29c8fda89621 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:47:12.973162328 +0000 UTC m=+501.826033026 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d22fadcc-27c7-47ce-8cd4-29c8fda89621-proxy-tls") pod "error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" (UID: "d22fadcc-27c7-47ce-8cd4-29c8fda89621") : secret "error-404-isvc-2c4dd-predictor-serving-cert" not found Apr 24 16:47:12.473709 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.473686 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-2c4dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d22fadcc-27c7-47ce-8cd4-29c8fda89621-error-404-isvc-2c4dd-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt\" (UID: \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\") " pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:47:12.478808 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.478786 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" Apr 24 16:47:12.490135 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.490100 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmhqc\" (UniqueName: \"kubernetes.io/projected/d22fadcc-27c7-47ce-8cd4-29c8fda89621-kube-api-access-pmhqc\") pod \"error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt\" (UID: \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\") " pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:47:12.599975 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.599907 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4"] Apr 24 16:47:12.603296 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:47:12.603269 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0352dfc6_f89c_4592_899c_58affbababb4.slice/crio-d2ee31dde8893cc9bda929da775bcb0ec2d34e68b1c1a5670d8185e67c8762c1 WatchSource:0}: Error finding container d2ee31dde8893cc9bda929da775bcb0ec2d34e68b1c1a5670d8185e67c8762c1: Status 404 returned error can't find the container with id d2ee31dde8893cc9bda929da775bcb0ec2d34e68b1c1a5670d8185e67c8762c1 Apr 24 16:47:12.978046 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.978012 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d22fadcc-27c7-47ce-8cd4-29c8fda89621-proxy-tls\") pod \"error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt\" (UID: \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\") " pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:47:12.980228 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:12.980211 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/d22fadcc-27c7-47ce-8cd4-29c8fda89621-proxy-tls\") pod \"error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt\" (UID: \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\") " pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:47:13.100272 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:13.100235 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.29:8643/healthz\": dial tcp 10.133.0.29:8643: connect: connection refused" Apr 24 16:47:13.100604 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:13.100236 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.30:8643/healthz\": dial tcp 10.133.0.30:8643: connect: connection refused" Apr 24 16:47:13.105073 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:13.105051 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 24 16:47:13.105152 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:13.105081 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 24 16:47:13.167327 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:13.167292 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:47:13.286792 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:13.286760 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt"] Apr 24 16:47:13.290032 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:47:13.290005 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd22fadcc_27c7_47ce_8cd4_29c8fda89621.slice/crio-f4aafa37e57a89f4f845d70684d0427597ab8dd18734a3c1119e13ecae175c38 WatchSource:0}: Error finding container f4aafa37e57a89f4f845d70684d0427597ab8dd18734a3c1119e13ecae175c38: Status 404 returned error can't find the container with id f4aafa37e57a89f4f845d70684d0427597ab8dd18734a3c1119e13ecae175c38 Apr 24 16:47:13.334342 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:13.334315 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" event={"ID":"d22fadcc-27c7-47ce-8cd4-29c8fda89621","Type":"ContainerStarted","Data":"f4aafa37e57a89f4f845d70684d0427597ab8dd18734a3c1119e13ecae175c38"} Apr 24 16:47:13.335780 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:13.335753 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" event={"ID":"0352dfc6-f89c-4592-899c-58affbababb4","Type":"ContainerStarted","Data":"ee0717db7c1223089313b1712e5f1a88e7b724ef90071dacac9dcaee60520d5d"} Apr 24 16:47:13.335870 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:13.335789 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" event={"ID":"0352dfc6-f89c-4592-899c-58affbababb4","Type":"ContainerStarted","Data":"1e9a75f76faa3cd0098ee9f85cd68bc1958fe1ed54029b99006be817b4792feb"} Apr 24 16:47:13.335870 ip-10-0-128-44 
kubenswrapper[2561]: I0424 16:47:13.335805 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" event={"ID":"0352dfc6-f89c-4592-899c-58affbababb4","Type":"ContainerStarted","Data":"d2ee31dde8893cc9bda929da775bcb0ec2d34e68b1c1a5670d8185e67c8762c1"} Apr 24 16:47:13.335870 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:13.335837 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" Apr 24 16:47:13.352654 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:13.352441 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" podStartSLOduration=1.352425013 podStartE2EDuration="1.352425013s" podCreationTimestamp="2026-04-24 16:47:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:47:13.35198635 +0000 UTC m=+502.204857064" watchObservedRunningTime="2026-04-24 16:47:13.352425013 +0000 UTC m=+502.205295727" Apr 24 16:47:14.340771 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:14.340741 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" event={"ID":"d22fadcc-27c7-47ce-8cd4-29c8fda89621","Type":"ContainerStarted","Data":"a56569614e1ab2bb609ae87c595be0cfc114b9f19383c98e4c093637e3fda213"} Apr 24 16:47:14.340771 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:14.340774 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" event={"ID":"d22fadcc-27c7-47ce-8cd4-29c8fda89621","Type":"ContainerStarted","Data":"2c53a66578a2cc69a293fecad00e99248b5529fe6ac2100f49473901bcf0590a"} Apr 24 16:47:14.341263 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:14.341141 2561 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" Apr 24 16:47:14.342481 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:14.342458 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" podUID="0352dfc6-f89c-4592-899c-58affbababb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 24 16:47:14.382184 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:14.382140 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" podStartSLOduration=2.382111217 podStartE2EDuration="2.382111217s" podCreationTimestamp="2026-04-24 16:47:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:47:14.381552893 +0000 UTC m=+503.234423606" watchObservedRunningTime="2026-04-24 16:47:14.382111217 +0000 UTC m=+503.234981958" Apr 24 16:47:15.119941 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.119921 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" Apr 24 16:47:15.298170 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.298136 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-32209-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b01fef6-6066-43b8-a153-e14d7d125268-success-200-isvc-32209-kube-rbac-proxy-sar-config\") pod \"9b01fef6-6066-43b8-a153-e14d7d125268\" (UID: \"9b01fef6-6066-43b8-a153-e14d7d125268\") " Apr 24 16:47:15.298314 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.298193 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpr5j\" (UniqueName: \"kubernetes.io/projected/9b01fef6-6066-43b8-a153-e14d7d125268-kube-api-access-bpr5j\") pod \"9b01fef6-6066-43b8-a153-e14d7d125268\" (UID: \"9b01fef6-6066-43b8-a153-e14d7d125268\") " Apr 24 16:47:15.298314 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.298235 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b01fef6-6066-43b8-a153-e14d7d125268-proxy-tls\") pod \"9b01fef6-6066-43b8-a153-e14d7d125268\" (UID: \"9b01fef6-6066-43b8-a153-e14d7d125268\") " Apr 24 16:47:15.298504 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.298481 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b01fef6-6066-43b8-a153-e14d7d125268-success-200-isvc-32209-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-32209-kube-rbac-proxy-sar-config") pod "9b01fef6-6066-43b8-a153-e14d7d125268" (UID: "9b01fef6-6066-43b8-a153-e14d7d125268"). InnerVolumeSpecName "success-200-isvc-32209-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:47:15.300340 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.300317 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b01fef6-6066-43b8-a153-e14d7d125268-kube-api-access-bpr5j" (OuterVolumeSpecName: "kube-api-access-bpr5j") pod "9b01fef6-6066-43b8-a153-e14d7d125268" (UID: "9b01fef6-6066-43b8-a153-e14d7d125268"). InnerVolumeSpecName "kube-api-access-bpr5j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:47:15.300340 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.300324 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b01fef6-6066-43b8-a153-e14d7d125268-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9b01fef6-6066-43b8-a153-e14d7d125268" (UID: "9b01fef6-6066-43b8-a153-e14d7d125268"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:47:15.345189 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.345162 2561 generic.go:358] "Generic (PLEG): container finished" podID="9b01fef6-6066-43b8-a153-e14d7d125268" containerID="3d82def966813d1ccb3c7f1a3cbc1b88a1b7149093cfcb92cb021116598640aa" exitCode=0 Apr 24 16:47:15.345596 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.345236 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" Apr 24 16:47:15.345596 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.345250 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" event={"ID":"9b01fef6-6066-43b8-a153-e14d7d125268","Type":"ContainerDied","Data":"3d82def966813d1ccb3c7f1a3cbc1b88a1b7149093cfcb92cb021116598640aa"} Apr 24 16:47:15.345596 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.345291 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8" event={"ID":"9b01fef6-6066-43b8-a153-e14d7d125268","Type":"ContainerDied","Data":"e59c06c20da534feeb825cc8dd426cea432b79b0c2ff890702ec124c88f59f4e"} Apr 24 16:47:15.345596 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.345311 2561 scope.go:117] "RemoveContainer" containerID="82eb39872b2e9d3d1de30425b840e4bd8523b97952308dab67a35393ca09c3f7" Apr 24 16:47:15.346928 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.346908 2561 generic.go:358] "Generic (PLEG): container finished" podID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerID="1b473988aa85e15f65dd76b249f923f3195fe9141fa84a2db0bad1c87578b071" exitCode=0 Apr 24 16:47:15.347024 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.346976 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" event={"ID":"3a5f72e4-d8de-42d7-9078-2fa28a50203d","Type":"ContainerDied","Data":"1b473988aa85e15f65dd76b249f923f3195fe9141fa84a2db0bad1c87578b071"} Apr 24 16:47:15.347522 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.347477 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" podUID="0352dfc6-f89c-4592-899c-58affbababb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: 
connect: connection refused" Apr 24 16:47:15.347690 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.347672 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:47:15.347787 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.347708 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:47:15.351100 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.349077 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" podUID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 24 16:47:15.358111 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.358069 2561 scope.go:117] "RemoveContainer" containerID="3d82def966813d1ccb3c7f1a3cbc1b88a1b7149093cfcb92cb021116598640aa" Apr 24 16:47:15.365620 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.365601 2561 scope.go:117] "RemoveContainer" containerID="82eb39872b2e9d3d1de30425b840e4bd8523b97952308dab67a35393ca09c3f7" Apr 24 16:47:15.365885 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:47:15.365860 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82eb39872b2e9d3d1de30425b840e4bd8523b97952308dab67a35393ca09c3f7\": container with ID starting with 82eb39872b2e9d3d1de30425b840e4bd8523b97952308dab67a35393ca09c3f7 not found: ID does not exist" containerID="82eb39872b2e9d3d1de30425b840e4bd8523b97952308dab67a35393ca09c3f7" Apr 24 16:47:15.365925 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.365897 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82eb39872b2e9d3d1de30425b840e4bd8523b97952308dab67a35393ca09c3f7"} 
err="failed to get container status \"82eb39872b2e9d3d1de30425b840e4bd8523b97952308dab67a35393ca09c3f7\": rpc error: code = NotFound desc = could not find container \"82eb39872b2e9d3d1de30425b840e4bd8523b97952308dab67a35393ca09c3f7\": container with ID starting with 82eb39872b2e9d3d1de30425b840e4bd8523b97952308dab67a35393ca09c3f7 not found: ID does not exist"
Apr 24 16:47:15.365925 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.365920 2561 scope.go:117] "RemoveContainer" containerID="3d82def966813d1ccb3c7f1a3cbc1b88a1b7149093cfcb92cb021116598640aa"
Apr 24 16:47:15.366189 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:47:15.366166 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d82def966813d1ccb3c7f1a3cbc1b88a1b7149093cfcb92cb021116598640aa\": container with ID starting with 3d82def966813d1ccb3c7f1a3cbc1b88a1b7149093cfcb92cb021116598640aa not found: ID does not exist" containerID="3d82def966813d1ccb3c7f1a3cbc1b88a1b7149093cfcb92cb021116598640aa"
Apr 24 16:47:15.366269 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.366193 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d82def966813d1ccb3c7f1a3cbc1b88a1b7149093cfcb92cb021116598640aa"} err="failed to get container status \"3d82def966813d1ccb3c7f1a3cbc1b88a1b7149093cfcb92cb021116598640aa\": rpc error: code = NotFound desc = could not find container \"3d82def966813d1ccb3c7f1a3cbc1b88a1b7149093cfcb92cb021116598640aa\": container with ID starting with 3d82def966813d1ccb3c7f1a3cbc1b88a1b7149093cfcb92cb021116598640aa not found: ID does not exist"
Apr 24 16:47:15.376019 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.375992 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8"]
Apr 24 16:47:15.381109 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.381086 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32209-predictor-649bfcd444-vqfb8"]
Apr 24 16:47:15.391027 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.391011 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"
Apr 24 16:47:15.399424 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.399408 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bpr5j\" (UniqueName: \"kubernetes.io/projected/9b01fef6-6066-43b8-a153-e14d7d125268-kube-api-access-bpr5j\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:47:15.399499 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.399426 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b01fef6-6066-43b8-a153-e14d7d125268-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:47:15.399499 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.399438 2561 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-32209-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b01fef6-6066-43b8-a153-e14d7d125268-success-200-isvc-32209-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:47:15.500327 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.500289 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a5f72e4-d8de-42d7-9078-2fa28a50203d-proxy-tls\") pod \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\" (UID: \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\") "
Apr 24 16:47:15.500506 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.500347 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-32209-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3a5f72e4-d8de-42d7-9078-2fa28a50203d-error-404-isvc-32209-kube-rbac-proxy-sar-config\") pod \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\" (UID: \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\") "
Apr 24 16:47:15.500506 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.500404 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtzgj\" (UniqueName: \"kubernetes.io/projected/3a5f72e4-d8de-42d7-9078-2fa28a50203d-kube-api-access-mtzgj\") pod \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\" (UID: \"3a5f72e4-d8de-42d7-9078-2fa28a50203d\") "
Apr 24 16:47:15.500952 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.500915 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a5f72e4-d8de-42d7-9078-2fa28a50203d-error-404-isvc-32209-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-32209-kube-rbac-proxy-sar-config") pod "3a5f72e4-d8de-42d7-9078-2fa28a50203d" (UID: "3a5f72e4-d8de-42d7-9078-2fa28a50203d"). InnerVolumeSpecName "error-404-isvc-32209-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:47:15.502430 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.502406 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a5f72e4-d8de-42d7-9078-2fa28a50203d-kube-api-access-mtzgj" (OuterVolumeSpecName: "kube-api-access-mtzgj") pod "3a5f72e4-d8de-42d7-9078-2fa28a50203d" (UID: "3a5f72e4-d8de-42d7-9078-2fa28a50203d"). InnerVolumeSpecName "kube-api-access-mtzgj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:47:15.502483 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.502430 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5f72e4-d8de-42d7-9078-2fa28a50203d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3a5f72e4-d8de-42d7-9078-2fa28a50203d" (UID: "3a5f72e4-d8de-42d7-9078-2fa28a50203d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:47:15.601601 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.601532 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mtzgj\" (UniqueName: \"kubernetes.io/projected/3a5f72e4-d8de-42d7-9078-2fa28a50203d-kube-api-access-mtzgj\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:47:15.601601 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.601556 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a5f72e4-d8de-42d7-9078-2fa28a50203d-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:47:15.601601 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.601568 2561 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-32209-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3a5f72e4-d8de-42d7-9078-2fa28a50203d-error-404-isvc-32209-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:47:15.651918 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:15.651887 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" path="/var/lib/kubelet/pods/9b01fef6-6066-43b8-a153-e14d7d125268/volumes"
Apr 24 16:47:16.352215 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:16.352187 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"
Apr 24 16:47:16.352632 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:16.352186 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5" event={"ID":"3a5f72e4-d8de-42d7-9078-2fa28a50203d","Type":"ContainerDied","Data":"d8e2309920d3b0867047951b40707662a11620a121e51cfed3ccdf1038be6379"}
Apr 24 16:47:16.352632 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:16.352308 2561 scope.go:117] "RemoveContainer" containerID="0515c2acbfc2772445f360c21eccf8812e10602d69c9f401d4498fd678780208"
Apr 24 16:47:16.352968 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:16.352940 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" podUID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 16:47:16.359924 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:16.359905 2561 scope.go:117] "RemoveContainer" containerID="1b473988aa85e15f65dd76b249f923f3195fe9141fa84a2db0bad1c87578b071"
Apr 24 16:47:16.373586 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:16.373560 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"]
Apr 24 16:47:16.378755 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:16.378727 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-32209-predictor-9c8dc8f6c-5r4r5"]
Apr 24 16:47:17.653538 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:17.653498 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" path="/var/lib/kubelet/pods/3a5f72e4-d8de-42d7-9078-2fa28a50203d/volumes"
Apr 24 16:47:20.351955 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:20.351929 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4"
Apr 24 16:47:20.352464 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:20.352442 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" podUID="0352dfc6-f89c-4592-899c-58affbababb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 24 16:47:21.356226 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:21.356196 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt"
Apr 24 16:47:21.356741 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:21.356715 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" podUID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 16:47:30.352754 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:30.352717 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" podUID="0352dfc6-f89c-4592-899c-58affbababb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 24 16:47:31.357284 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:31.357240 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" podUID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 16:47:40.353305 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:40.353271 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" podUID="0352dfc6-f89c-4592-899c-58affbababb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 24 16:47:41.357071 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:41.357034 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" podUID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 16:47:48.012201 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.012167 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"]
Apr 24 16:47:48.012579 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.012501 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerName="kube-rbac-proxy"
Apr 24 16:47:48.012579 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.012512 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerName="kube-rbac-proxy"
Apr 24 16:47:48.012579 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.012522 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kserve-container"
Apr 24 16:47:48.012579 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.012527 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kserve-container"
Apr 24 16:47:48.012579 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.012535 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerName="kserve-container"
Apr 24 16:47:48.012579 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.012542 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerName="kserve-container"
Apr 24 16:47:48.012579 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.012561 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kube-rbac-proxy"
Apr 24 16:47:48.012579 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.012566 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kube-rbac-proxy"
Apr 24 16:47:48.012813 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.012621 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kube-rbac-proxy"
Apr 24 16:47:48.012813 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.012632 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerName="kserve-container"
Apr 24 16:47:48.012813 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.012637 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a5f72e4-d8de-42d7-9078-2fa28a50203d" containerName="kube-rbac-proxy"
Apr 24 16:47:48.012813 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.012644 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b01fef6-6066-43b8-a153-e14d7d125268" containerName="kserve-container"
Apr 24 16:47:48.015897 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.015877 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"
Apr 24 16:47:48.018432 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.018410 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-30dd6-predictor-serving-cert\""
Apr 24 16:47:48.018432 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.018422 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-30dd6-kube-rbac-proxy-sar-config\""
Apr 24 16:47:48.024560 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.024519 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"]
Apr 24 16:47:48.043418 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.043397 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l8wj\" (UniqueName: \"kubernetes.io/projected/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-kube-api-access-4l8wj\") pod \"success-200-isvc-30dd6-predictor-94645b55d-96c4f\" (UID: \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\") " pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"
Apr 24 16:47:48.043525 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.043435 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-proxy-tls\") pod \"success-200-isvc-30dd6-predictor-94645b55d-96c4f\" (UID: \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\") " pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"
Apr 24 16:47:48.043525 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.043456 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-30dd6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-success-200-isvc-30dd6-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-30dd6-predictor-94645b55d-96c4f\" (UID: \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\") " pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"
Apr 24 16:47:48.113825 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.113793 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"]
Apr 24 16:47:48.123751 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.123723 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"
Apr 24 16:47:48.128624 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.128501 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-30dd6-kube-rbac-proxy-sar-config\""
Apr 24 16:47:48.128761 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.128514 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-30dd6-predictor-serving-cert\""
Apr 24 16:47:48.128761 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.128585 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"]
Apr 24 16:47:48.143982 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.143952 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-30dd6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8a1736d-f782-4dea-9285-96f0b6062300-error-404-isvc-30dd6-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-30dd6-predictor-79f6568b65-kq94r\" (UID: \"a8a1736d-f782-4dea-9285-96f0b6062300\") " pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"
Apr 24 16:47:48.144150 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.143989 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbhgk\" (UniqueName: \"kubernetes.io/projected/a8a1736d-f782-4dea-9285-96f0b6062300-kube-api-access-mbhgk\") pod \"error-404-isvc-30dd6-predictor-79f6568b65-kq94r\" (UID: \"a8a1736d-f782-4dea-9285-96f0b6062300\") " pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"
Apr 24 16:47:48.144150 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.144013 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4l8wj\" (UniqueName: \"kubernetes.io/projected/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-kube-api-access-4l8wj\") pod \"success-200-isvc-30dd6-predictor-94645b55d-96c4f\" (UID: \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\") " pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"
Apr 24 16:47:48.144263 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.144179 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-proxy-tls\") pod \"success-200-isvc-30dd6-predictor-94645b55d-96c4f\" (UID: \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\") " pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"
Apr 24 16:47:48.144263 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.144209 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8a1736d-f782-4dea-9285-96f0b6062300-proxy-tls\") pod \"error-404-isvc-30dd6-predictor-79f6568b65-kq94r\" (UID: \"a8a1736d-f782-4dea-9285-96f0b6062300\") " pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"
Apr 24 16:47:48.144263 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.144239 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-30dd6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-success-200-isvc-30dd6-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-30dd6-predictor-94645b55d-96c4f\" (UID: \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\") " pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"
Apr 24 16:47:48.144425 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:47:48.144380 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-serving-cert: secret "success-200-isvc-30dd6-predictor-serving-cert" not found
Apr 24 16:47:48.144482 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:47:48.144453 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-proxy-tls podName:b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0 nodeName:}" failed. No retries permitted until 2026-04-24 16:47:48.644431925 +0000 UTC m=+537.497302616 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-proxy-tls") pod "success-200-isvc-30dd6-predictor-94645b55d-96c4f" (UID: "b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0") : secret "success-200-isvc-30dd6-predictor-serving-cert" not found
Apr 24 16:47:48.145045 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.145020 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-30dd6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-success-200-isvc-30dd6-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-30dd6-predictor-94645b55d-96c4f\" (UID: \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\") " pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"
Apr 24 16:47:48.155547 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.155512 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l8wj\" (UniqueName: \"kubernetes.io/projected/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-kube-api-access-4l8wj\") pod \"success-200-isvc-30dd6-predictor-94645b55d-96c4f\" (UID: \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\") " pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"
Apr 24 16:47:48.244864 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.244821 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8a1736d-f782-4dea-9285-96f0b6062300-proxy-tls\") pod \"error-404-isvc-30dd6-predictor-79f6568b65-kq94r\" (UID: \"a8a1736d-f782-4dea-9285-96f0b6062300\") " pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"
Apr 24 16:47:48.245052 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.244912 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-30dd6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8a1736d-f782-4dea-9285-96f0b6062300-error-404-isvc-30dd6-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-30dd6-predictor-79f6568b65-kq94r\" (UID: \"a8a1736d-f782-4dea-9285-96f0b6062300\") " pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"
Apr 24 16:47:48.245052 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.244961 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbhgk\" (UniqueName: \"kubernetes.io/projected/a8a1736d-f782-4dea-9285-96f0b6062300-kube-api-access-mbhgk\") pod \"error-404-isvc-30dd6-predictor-79f6568b65-kq94r\" (UID: \"a8a1736d-f782-4dea-9285-96f0b6062300\") " pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"
Apr 24 16:47:48.245052 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:47:48.244991 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-serving-cert: secret "error-404-isvc-30dd6-predictor-serving-cert" not found
Apr 24 16:47:48.245250 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:47:48.245081 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a1736d-f782-4dea-9285-96f0b6062300-proxy-tls podName:a8a1736d-f782-4dea-9285-96f0b6062300 nodeName:}" failed. No retries permitted until 2026-04-24 16:47:48.745060634 +0000 UTC m=+537.597931328 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a8a1736d-f782-4dea-9285-96f0b6062300-proxy-tls") pod "error-404-isvc-30dd6-predictor-79f6568b65-kq94r" (UID: "a8a1736d-f782-4dea-9285-96f0b6062300") : secret "error-404-isvc-30dd6-predictor-serving-cert" not found
Apr 24 16:47:48.245692 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.245672 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-30dd6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8a1736d-f782-4dea-9285-96f0b6062300-error-404-isvc-30dd6-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-30dd6-predictor-79f6568b65-kq94r\" (UID: \"a8a1736d-f782-4dea-9285-96f0b6062300\") " pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"
Apr 24 16:47:48.256619 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.256592 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbhgk\" (UniqueName: \"kubernetes.io/projected/a8a1736d-f782-4dea-9285-96f0b6062300-kube-api-access-mbhgk\") pod \"error-404-isvc-30dd6-predictor-79f6568b65-kq94r\" (UID: \"a8a1736d-f782-4dea-9285-96f0b6062300\") " pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"
Apr 24 16:47:48.648723 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.648691 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-proxy-tls\") pod \"success-200-isvc-30dd6-predictor-94645b55d-96c4f\" (UID: \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\") " pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"
Apr 24 16:47:48.650943 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.650916 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-proxy-tls\") pod \"success-200-isvc-30dd6-predictor-94645b55d-96c4f\" (UID: \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\") " pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"
Apr 24 16:47:48.750127 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.750094 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8a1736d-f782-4dea-9285-96f0b6062300-proxy-tls\") pod \"error-404-isvc-30dd6-predictor-79f6568b65-kq94r\" (UID: \"a8a1736d-f782-4dea-9285-96f0b6062300\") " pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"
Apr 24 16:47:48.752335 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.752316 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8a1736d-f782-4dea-9285-96f0b6062300-proxy-tls\") pod \"error-404-isvc-30dd6-predictor-79f6568b65-kq94r\" (UID: \"a8a1736d-f782-4dea-9285-96f0b6062300\") " pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"
Apr 24 16:47:48.927770 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:48.927691 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"
Apr 24 16:47:49.036128 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:49.036086 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"
Apr 24 16:47:49.045800 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:49.045779 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"]
Apr 24 16:47:49.047863 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:47:49.047839 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5b245aa_cb3e_44ab_8c6e_8d8cf0d3a0c0.slice/crio-55231d3618229f8c0ebb0047bf2b884bf0dfdb96d4093478511e399c775d6a37 WatchSource:0}: Error finding container 55231d3618229f8c0ebb0047bf2b884bf0dfdb96d4093478511e399c775d6a37: Status 404 returned error can't find the container with id 55231d3618229f8c0ebb0047bf2b884bf0dfdb96d4093478511e399c775d6a37
Apr 24 16:47:49.175933 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:49.175905 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"]
Apr 24 16:47:49.178614 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:47:49.178562 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8a1736d_f782_4dea_9285_96f0b6062300.slice/crio-0e57a2982f89096707b0c2898989804e40928cb0de7dba3cda032e500a2e0644 WatchSource:0}: Error finding container 0e57a2982f89096707b0c2898989804e40928cb0de7dba3cda032e500a2e0644: Status 404 returned error can't find the container with id 0e57a2982f89096707b0c2898989804e40928cb0de7dba3cda032e500a2e0644
Apr 24 16:47:49.456422 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:49.456383 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" event={"ID":"a8a1736d-f782-4dea-9285-96f0b6062300","Type":"ContainerStarted","Data":"b302960066107c29aeae15a3aa500ce451f194bdb002d64babd165df54b4db01"}
Apr 24 16:47:49.456422 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:49.456423 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" event={"ID":"a8a1736d-f782-4dea-9285-96f0b6062300","Type":"ContainerStarted","Data":"5b37c8c346814500d9324c1164bc1ab1b23eb94c07ef039eb3bea4d5f575e572"}
Apr 24 16:47:49.456648 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:49.456437 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" event={"ID":"a8a1736d-f782-4dea-9285-96f0b6062300","Type":"ContainerStarted","Data":"0e57a2982f89096707b0c2898989804e40928cb0de7dba3cda032e500a2e0644"}
Apr 24 16:47:49.456648 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:49.456503 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"
Apr 24 16:47:49.457890 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:49.457865 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" event={"ID":"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0","Type":"ContainerStarted","Data":"f58e4c96aaaa4af5448f92cecfc232b460249299182c196c20bf9e9ef9c6626d"}
Apr 24 16:47:49.457890 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:49.457891 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" event={"ID":"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0","Type":"ContainerStarted","Data":"96df00680669d85c98ed48d50bd23062727a0bb92b503e853ddd9061ad98cfbb"}
Apr 24 16:47:49.458049 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:49.457901 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" event={"ID":"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0","Type":"ContainerStarted","Data":"55231d3618229f8c0ebb0047bf2b884bf0dfdb96d4093478511e399c775d6a37"}
Apr 24 16:47:49.458049 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:49.457994 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"
Apr 24 16:47:49.479098 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:49.479053 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" podStartSLOduration=1.479039355 podStartE2EDuration="1.479039355s" podCreationTimestamp="2026-04-24 16:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:47:49.477101847 +0000 UTC m=+538.329972584" watchObservedRunningTime="2026-04-24 16:47:49.479039355 +0000 UTC m=+538.331910068"
Apr 24 16:47:49.496237 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:49.496194 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" podStartSLOduration=2.496181311 podStartE2EDuration="2.496181311s" podCreationTimestamp="2026-04-24 16:47:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:47:49.495513058 +0000 UTC m=+538.348383799" watchObservedRunningTime="2026-04-24 16:47:49.496181311 +0000 UTC m=+538.349052023"
Apr 24 16:47:50.352743 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:50.352703 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" podUID="0352dfc6-f89c-4592-899c-58affbababb4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused"
Apr 24 16:47:50.461542 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:50.461511 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"
Apr 24 16:47:50.461542 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:50.461548 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"
Apr 24 16:47:50.462611 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:50.462583 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" podUID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 16:47:50.462611 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:50.462599 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 24 16:47:51.357073 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:51.357036 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" podUID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 24 16:47:51.464868 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:51.464830 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 24 16:47:51.465051 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:51.464912 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" podUID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 16:47:56.469549 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:56.469517 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"
Apr 24 16:47:56.470163 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:56.470136 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"
Apr 24 16:47:56.470301 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:56.470173 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 24 16:47:56.470679 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:47:56.470648 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" podUID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 16:48:00.353452 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:48:00.353421 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4"
Apr 24 16:48:01.357453 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:48:01.357428 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt"
Apr 24 16:48:06.470851 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:48:06.470814 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" podUID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 16:48:06.471214 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:48:06.470814 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 24 16:48:16.470748 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:48:16.470695 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 24 16:48:16.471245 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:48:16.470695 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" podUID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 24 16:48:26.470381 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:48:26.470339 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused"
Apr 24 16:48:26.470949 ip-10-0-128-44
kubenswrapper[2561]: I0424 16:48:26.470564 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" podUID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 24 16:48:36.470888 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:48:36.470848 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" Apr 24 16:48:36.471352 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:48:36.471218 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" Apr 24 16:56:27.246344 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.246310 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4"] Apr 24 16:56:27.248666 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.246608 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" podUID="0352dfc6-f89c-4592-899c-58affbababb4" containerName="kube-rbac-proxy" containerID="cri-o://ee0717db7c1223089313b1712e5f1a88e7b724ef90071dacac9dcaee60520d5d" gracePeriod=30 Apr 24 16:56:27.248666 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.246722 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" podUID="0352dfc6-f89c-4592-899c-58affbababb4" containerName="kserve-container" containerID="cri-o://1e9a75f76faa3cd0098ee9f85cd68bc1958fe1ed54029b99006be817b4792feb" gracePeriod=30 Apr 24 16:56:27.580060 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.579975 2561 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj"] Apr 24 16:56:27.583275 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.583253 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" Apr 24 16:56:27.589503 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.589325 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f6171-kube-rbac-proxy-sar-config\"" Apr 24 16:56:27.589503 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.589381 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f6171-predictor-serving-cert\"" Apr 24 16:56:27.608590 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.608570 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6"] Apr 24 16:56:27.611832 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.611816 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" Apr 24 16:56:27.620192 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.620139 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-f6171-kube-rbac-proxy-sar-config\"" Apr 24 16:56:27.622640 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.622618 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj"] Apr 24 16:56:27.626503 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.626481 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-f6171-predictor-serving-cert\"" Apr 24 16:56:27.644302 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.644279 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6"] Apr 24 16:56:27.674928 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.674901 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d7jj\" (UniqueName: \"kubernetes.io/projected/78bbb2de-6a30-45fd-ac95-13a9331a5f04-kube-api-access-6d7jj\") pod \"error-404-isvc-f6171-predictor-669d57f96f-22hp6\" (UID: \"78bbb2de-6a30-45fd-ac95-13a9331a5f04\") " pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" Apr 24 16:56:27.675044 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.674939 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba06523a-200e-4f01-8d61-6e865e8fb74f-proxy-tls\") pod \"success-200-isvc-f6171-predictor-778d4b8f54-pj5fj\" (UID: \"ba06523a-200e-4f01-8d61-6e865e8fb74f\") " pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" Apr 24 16:56:27.675044 ip-10-0-128-44 kubenswrapper[2561]: 
I0424 16:56:27.674961 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-f6171-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ba06523a-200e-4f01-8d61-6e865e8fb74f-success-200-isvc-f6171-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f6171-predictor-778d4b8f54-pj5fj\" (UID: \"ba06523a-200e-4f01-8d61-6e865e8fb74f\") " pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" Apr 24 16:56:27.675044 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.675003 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78bbb2de-6a30-45fd-ac95-13a9331a5f04-proxy-tls\") pod \"error-404-isvc-f6171-predictor-669d57f96f-22hp6\" (UID: \"78bbb2de-6a30-45fd-ac95-13a9331a5f04\") " pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" Apr 24 16:56:27.675228 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.675092 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-967vg\" (UniqueName: \"kubernetes.io/projected/ba06523a-200e-4f01-8d61-6e865e8fb74f-kube-api-access-967vg\") pod \"success-200-isvc-f6171-predictor-778d4b8f54-pj5fj\" (UID: \"ba06523a-200e-4f01-8d61-6e865e8fb74f\") " pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" Apr 24 16:56:27.675228 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.675156 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-f6171-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/78bbb2de-6a30-45fd-ac95-13a9331a5f04-error-404-isvc-f6171-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f6171-predictor-669d57f96f-22hp6\" (UID: \"78bbb2de-6a30-45fd-ac95-13a9331a5f04\") " pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" Apr 24 
16:56:27.756539 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.756511 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt"] Apr 24 16:56:27.756803 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.756780 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" podUID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerName="kserve-container" containerID="cri-o://2c53a66578a2cc69a293fecad00e99248b5529fe6ac2100f49473901bcf0590a" gracePeriod=30 Apr 24 16:56:27.756901 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.756807 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" podUID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerName="kube-rbac-proxy" containerID="cri-o://a56569614e1ab2bb609ae87c595be0cfc114b9f19383c98e4c093637e3fda213" gracePeriod=30 Apr 24 16:56:27.776318 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.776295 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6d7jj\" (UniqueName: \"kubernetes.io/projected/78bbb2de-6a30-45fd-ac95-13a9331a5f04-kube-api-access-6d7jj\") pod \"error-404-isvc-f6171-predictor-669d57f96f-22hp6\" (UID: \"78bbb2de-6a30-45fd-ac95-13a9331a5f04\") " pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" Apr 24 16:56:27.776440 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.776337 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba06523a-200e-4f01-8d61-6e865e8fb74f-proxy-tls\") pod \"success-200-isvc-f6171-predictor-778d4b8f54-pj5fj\" (UID: \"ba06523a-200e-4f01-8d61-6e865e8fb74f\") " pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" Apr 24 16:56:27.776440 ip-10-0-128-44 kubenswrapper[2561]: I0424 
16:56:27.776358 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-f6171-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ba06523a-200e-4f01-8d61-6e865e8fb74f-success-200-isvc-f6171-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f6171-predictor-778d4b8f54-pj5fj\" (UID: \"ba06523a-200e-4f01-8d61-6e865e8fb74f\") " pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" Apr 24 16:56:27.776652 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.776616 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78bbb2de-6a30-45fd-ac95-13a9331a5f04-proxy-tls\") pod \"error-404-isvc-f6171-predictor-669d57f96f-22hp6\" (UID: \"78bbb2de-6a30-45fd-ac95-13a9331a5f04\") " pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" Apr 24 16:56:27.776838 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.776814 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-967vg\" (UniqueName: \"kubernetes.io/projected/ba06523a-200e-4f01-8d61-6e865e8fb74f-kube-api-access-967vg\") pod \"success-200-isvc-f6171-predictor-778d4b8f54-pj5fj\" (UID: \"ba06523a-200e-4f01-8d61-6e865e8fb74f\") " pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" Apr 24 16:56:27.776958 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.776856 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-f6171-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/78bbb2de-6a30-45fd-ac95-13a9331a5f04-error-404-isvc-f6171-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f6171-predictor-669d57f96f-22hp6\" (UID: \"78bbb2de-6a30-45fd-ac95-13a9331a5f04\") " pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" Apr 24 16:56:27.777060 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.777036 2561 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-f6171-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ba06523a-200e-4f01-8d61-6e865e8fb74f-success-200-isvc-f6171-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f6171-predictor-778d4b8f54-pj5fj\" (UID: \"ba06523a-200e-4f01-8d61-6e865e8fb74f\") " pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" Apr 24 16:56:27.777672 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.777650 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-f6171-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/78bbb2de-6a30-45fd-ac95-13a9331a5f04-error-404-isvc-f6171-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-f6171-predictor-669d57f96f-22hp6\" (UID: \"78bbb2de-6a30-45fd-ac95-13a9331a5f04\") " pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" Apr 24 16:56:27.778956 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.778937 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba06523a-200e-4f01-8d61-6e865e8fb74f-proxy-tls\") pod \"success-200-isvc-f6171-predictor-778d4b8f54-pj5fj\" (UID: \"ba06523a-200e-4f01-8d61-6e865e8fb74f\") " pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" Apr 24 16:56:27.779011 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.778937 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78bbb2de-6a30-45fd-ac95-13a9331a5f04-proxy-tls\") pod \"error-404-isvc-f6171-predictor-669d57f96f-22hp6\" (UID: \"78bbb2de-6a30-45fd-ac95-13a9331a5f04\") " pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" Apr 24 16:56:27.787451 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.787425 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-967vg\" 
(UniqueName: \"kubernetes.io/projected/ba06523a-200e-4f01-8d61-6e865e8fb74f-kube-api-access-967vg\") pod \"success-200-isvc-f6171-predictor-778d4b8f54-pj5fj\" (UID: \"ba06523a-200e-4f01-8d61-6e865e8fb74f\") " pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" Apr 24 16:56:27.790976 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.790951 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d7jj\" (UniqueName: \"kubernetes.io/projected/78bbb2de-6a30-45fd-ac95-13a9331a5f04-kube-api-access-6d7jj\") pod \"error-404-isvc-f6171-predictor-669d57f96f-22hp6\" (UID: \"78bbb2de-6a30-45fd-ac95-13a9331a5f04\") " pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" Apr 24 16:56:27.893044 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.893018 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" Apr 24 16:56:27.921515 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:27.921488 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" Apr 24 16:56:28.034260 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:28.034238 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj"] Apr 24 16:56:28.035906 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:56:28.035869 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba06523a_200e_4f01_8d61_6e865e8fb74f.slice/crio-68da5e19b06315d5ae5ba5513231d737fb40f76973d48ec8ed2e073aeaa1eb7a WatchSource:0}: Error finding container 68da5e19b06315d5ae5ba5513231d737fb40f76973d48ec8ed2e073aeaa1eb7a: Status 404 returned error can't find the container with id 68da5e19b06315d5ae5ba5513231d737fb40f76973d48ec8ed2e073aeaa1eb7a Apr 24 16:56:28.037568 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:28.037554 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:56:28.055843 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:28.055732 2561 generic.go:358] "Generic (PLEG): container finished" podID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerID="a56569614e1ab2bb609ae87c595be0cfc114b9f19383c98e4c093637e3fda213" exitCode=2 Apr 24 16:56:28.055843 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:28.055799 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" event={"ID":"d22fadcc-27c7-47ce-8cd4-29c8fda89621","Type":"ContainerDied","Data":"a56569614e1ab2bb609ae87c595be0cfc114b9f19383c98e4c093637e3fda213"} Apr 24 16:56:28.055963 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:28.055918 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6"] Apr 24 16:56:28.056999 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:28.056971 2561 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" event={"ID":"ba06523a-200e-4f01-8d61-6e865e8fb74f","Type":"ContainerStarted","Data":"68da5e19b06315d5ae5ba5513231d737fb40f76973d48ec8ed2e073aeaa1eb7a"} Apr 24 16:56:28.058532 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:28.058494 2561 generic.go:358] "Generic (PLEG): container finished" podID="0352dfc6-f89c-4592-899c-58affbababb4" containerID="ee0717db7c1223089313b1712e5f1a88e7b724ef90071dacac9dcaee60520d5d" exitCode=2 Apr 24 16:56:28.058532 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:28.058521 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" event={"ID":"0352dfc6-f89c-4592-899c-58affbababb4","Type":"ContainerDied","Data":"ee0717db7c1223089313b1712e5f1a88e7b724ef90071dacac9dcaee60520d5d"} Apr 24 16:56:28.069205 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:56:28.069181 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78bbb2de_6a30_45fd_ac95_13a9331a5f04.slice/crio-84c2c73c90414498754d413972669fe140a535ce9a234073746e8820ece7255f WatchSource:0}: Error finding container 84c2c73c90414498754d413972669fe140a535ce9a234073746e8820ece7255f: Status 404 returned error can't find the container with id 84c2c73c90414498754d413972669fe140a535ce9a234073746e8820ece7255f Apr 24 16:56:29.063560 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:29.063524 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" event={"ID":"78bbb2de-6a30-45fd-ac95-13a9331a5f04","Type":"ContainerStarted","Data":"e0cbcaed5ff2a3ec75ce70c63c9ce18921c6963a807baac962c794372408f232"} Apr 24 16:56:29.063560 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:29.063565 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" event={"ID":"78bbb2de-6a30-45fd-ac95-13a9331a5f04","Type":"ContainerStarted","Data":"616bc54742b57b5103a8e137273f38d8a7eda7feffd8d27a74f8e4f2637df5d2"} Apr 24 16:56:29.063996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:29.063575 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" event={"ID":"78bbb2de-6a30-45fd-ac95-13a9331a5f04","Type":"ContainerStarted","Data":"84c2c73c90414498754d413972669fe140a535ce9a234073746e8820ece7255f"} Apr 24 16:56:29.063996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:29.063618 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" Apr 24 16:56:29.063996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:29.063721 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" Apr 24 16:56:29.065079 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:29.065046 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 16:56:29.065277 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:29.065259 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" event={"ID":"ba06523a-200e-4f01-8d61-6e865e8fb74f","Type":"ContainerStarted","Data":"f8c6d5d282c477c33efdfcef4ba3370dadd8b219e03ae0543f481e4962110e0b"} Apr 24 16:56:29.065325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:29.065285 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" 
event={"ID":"ba06523a-200e-4f01-8d61-6e865e8fb74f","Type":"ContainerStarted","Data":"76beabac42033d96d9a1c57cc5e10feab7f0fbb40ba8e8aacdd6b7825f43b185"} Apr 24 16:56:29.065395 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:29.065385 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" Apr 24 16:56:29.065426 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:29.065399 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" Apr 24 16:56:29.066465 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:29.066436 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 16:56:29.099966 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:29.099924 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" podStartSLOduration=2.099911016 podStartE2EDuration="2.099911016s" podCreationTimestamp="2026-04-24 16:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:56:29.098632127 +0000 UTC m=+1057.951502841" watchObservedRunningTime="2026-04-24 16:56:29.099911016 +0000 UTC m=+1057.952781725" Apr 24 16:56:29.134008 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:29.133966 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" podStartSLOduration=2.133958098 podStartE2EDuration="2.133958098s" podCreationTimestamp="2026-04-24 16:56:27 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:56:29.130637532 +0000 UTC m=+1057.983508244" watchObservedRunningTime="2026-04-24 16:56:29.133958098 +0000 UTC m=+1057.986828807" Apr 24 16:56:30.069514 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:30.069460 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 16:56:30.070007 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:30.069468 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 16:56:30.289077 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:30.289056 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" Apr 24 16:56:30.400224 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:30.400139 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-2c4dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0352dfc6-f89c-4592-899c-58affbababb4-success-200-isvc-2c4dd-kube-rbac-proxy-sar-config\") pod \"0352dfc6-f89c-4592-899c-58affbababb4\" (UID: \"0352dfc6-f89c-4592-899c-58affbababb4\") " Apr 24 16:56:30.400224 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:30.400182 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn6br\" (UniqueName: \"kubernetes.io/projected/0352dfc6-f89c-4592-899c-58affbababb4-kube-api-access-zn6br\") pod \"0352dfc6-f89c-4592-899c-58affbababb4\" (UID: \"0352dfc6-f89c-4592-899c-58affbababb4\") " Apr 24 16:56:30.400224 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:30.400225 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0352dfc6-f89c-4592-899c-58affbababb4-proxy-tls\") pod \"0352dfc6-f89c-4592-899c-58affbababb4\" (UID: \"0352dfc6-f89c-4592-899c-58affbababb4\") " Apr 24 16:56:30.400495 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:30.400467 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0352dfc6-f89c-4592-899c-58affbababb4-success-200-isvc-2c4dd-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-2c4dd-kube-rbac-proxy-sar-config") pod "0352dfc6-f89c-4592-899c-58affbababb4" (UID: "0352dfc6-f89c-4592-899c-58affbababb4"). InnerVolumeSpecName "success-200-isvc-2c4dd-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:56:30.402199 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:30.402172 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0352dfc6-f89c-4592-899c-58affbababb4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0352dfc6-f89c-4592-899c-58affbababb4" (UID: "0352dfc6-f89c-4592-899c-58affbababb4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:56:30.402199 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:30.402180 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0352dfc6-f89c-4592-899c-58affbababb4-kube-api-access-zn6br" (OuterVolumeSpecName: "kube-api-access-zn6br") pod "0352dfc6-f89c-4592-899c-58affbababb4" (UID: "0352dfc6-f89c-4592-899c-58affbababb4"). InnerVolumeSpecName "kube-api-access-zn6br". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:56:30.501379 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:30.501350 2561 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-2c4dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0352dfc6-f89c-4592-899c-58affbababb4-success-200-isvc-2c4dd-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:56:30.501379 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:30.501379 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zn6br\" (UniqueName: \"kubernetes.io/projected/0352dfc6-f89c-4592-899c-58affbababb4-kube-api-access-zn6br\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:56:30.501547 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:30.501389 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0352dfc6-f89c-4592-899c-58affbababb4-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:56:30.989025 
ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:30.989005 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:56:31.076812 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.076739 2561 generic.go:358] "Generic (PLEG): container finished" podID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerID="2c53a66578a2cc69a293fecad00e99248b5529fe6ac2100f49473901bcf0590a" exitCode=0 Apr 24 16:56:31.077196 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.076820 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" Apr 24 16:56:31.077196 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.076827 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" event={"ID":"d22fadcc-27c7-47ce-8cd4-29c8fda89621","Type":"ContainerDied","Data":"2c53a66578a2cc69a293fecad00e99248b5529fe6ac2100f49473901bcf0590a"} Apr 24 16:56:31.077196 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.076864 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt" event={"ID":"d22fadcc-27c7-47ce-8cd4-29c8fda89621","Type":"ContainerDied","Data":"f4aafa37e57a89f4f845d70684d0427597ab8dd18734a3c1119e13ecae175c38"} Apr 24 16:56:31.077196 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.076883 2561 scope.go:117] "RemoveContainer" containerID="a56569614e1ab2bb609ae87c595be0cfc114b9f19383c98e4c093637e3fda213" Apr 24 16:56:31.078317 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.078293 2561 generic.go:358] "Generic (PLEG): container finished" podID="0352dfc6-f89c-4592-899c-58affbababb4" containerID="1e9a75f76faa3cd0098ee9f85cd68bc1958fe1ed54029b99006be817b4792feb" exitCode=0 Apr 24 16:56:31.078431 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.078348 2561 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" event={"ID":"0352dfc6-f89c-4592-899c-58affbababb4","Type":"ContainerDied","Data":"1e9a75f76faa3cd0098ee9f85cd68bc1958fe1ed54029b99006be817b4792feb"} Apr 24 16:56:31.078431 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.078360 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" Apr 24 16:56:31.078431 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.078371 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4" event={"ID":"0352dfc6-f89c-4592-899c-58affbababb4","Type":"ContainerDied","Data":"d2ee31dde8893cc9bda929da775bcb0ec2d34e68b1c1a5670d8185e67c8762c1"} Apr 24 16:56:31.086974 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.086960 2561 scope.go:117] "RemoveContainer" containerID="2c53a66578a2cc69a293fecad00e99248b5529fe6ac2100f49473901bcf0590a" Apr 24 16:56:31.093636 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.093619 2561 scope.go:117] "RemoveContainer" containerID="a56569614e1ab2bb609ae87c595be0cfc114b9f19383c98e4c093637e3fda213" Apr 24 16:56:31.093850 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:56:31.093831 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a56569614e1ab2bb609ae87c595be0cfc114b9f19383c98e4c093637e3fda213\": container with ID starting with a56569614e1ab2bb609ae87c595be0cfc114b9f19383c98e4c093637e3fda213 not found: ID does not exist" containerID="a56569614e1ab2bb609ae87c595be0cfc114b9f19383c98e4c093637e3fda213" Apr 24 16:56:31.093908 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.093858 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a56569614e1ab2bb609ae87c595be0cfc114b9f19383c98e4c093637e3fda213"} 
err="failed to get container status \"a56569614e1ab2bb609ae87c595be0cfc114b9f19383c98e4c093637e3fda213\": rpc error: code = NotFound desc = could not find container \"a56569614e1ab2bb609ae87c595be0cfc114b9f19383c98e4c093637e3fda213\": container with ID starting with a56569614e1ab2bb609ae87c595be0cfc114b9f19383c98e4c093637e3fda213 not found: ID does not exist" Apr 24 16:56:31.093908 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.093873 2561 scope.go:117] "RemoveContainer" containerID="2c53a66578a2cc69a293fecad00e99248b5529fe6ac2100f49473901bcf0590a" Apr 24 16:56:31.094099 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:56:31.094082 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c53a66578a2cc69a293fecad00e99248b5529fe6ac2100f49473901bcf0590a\": container with ID starting with 2c53a66578a2cc69a293fecad00e99248b5529fe6ac2100f49473901bcf0590a not found: ID does not exist" containerID="2c53a66578a2cc69a293fecad00e99248b5529fe6ac2100f49473901bcf0590a" Apr 24 16:56:31.094154 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.094105 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c53a66578a2cc69a293fecad00e99248b5529fe6ac2100f49473901bcf0590a"} err="failed to get container status \"2c53a66578a2cc69a293fecad00e99248b5529fe6ac2100f49473901bcf0590a\": rpc error: code = NotFound desc = could not find container \"2c53a66578a2cc69a293fecad00e99248b5529fe6ac2100f49473901bcf0590a\": container with ID starting with 2c53a66578a2cc69a293fecad00e99248b5529fe6ac2100f49473901bcf0590a not found: ID does not exist" Apr 24 16:56:31.094154 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.094141 2561 scope.go:117] "RemoveContainer" containerID="ee0717db7c1223089313b1712e5f1a88e7b724ef90071dacac9dcaee60520d5d" Apr 24 16:56:31.100284 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.100270 2561 scope.go:117] "RemoveContainer" 
containerID="1e9a75f76faa3cd0098ee9f85cd68bc1958fe1ed54029b99006be817b4792feb" Apr 24 16:56:31.106893 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.106834 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4"] Apr 24 16:56:31.106949 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.106920 2561 scope.go:117] "RemoveContainer" containerID="ee0717db7c1223089313b1712e5f1a88e7b724ef90071dacac9dcaee60520d5d" Apr 24 16:56:31.106985 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.106964 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-2c4dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d22fadcc-27c7-47ce-8cd4-29c8fda89621-error-404-isvc-2c4dd-kube-rbac-proxy-sar-config\") pod \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\" (UID: \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\") " Apr 24 16:56:31.107049 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.107035 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d22fadcc-27c7-47ce-8cd4-29c8fda89621-proxy-tls\") pod \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\" (UID: \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\") " Apr 24 16:56:31.107091 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.107080 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmhqc\" (UniqueName: \"kubernetes.io/projected/d22fadcc-27c7-47ce-8cd4-29c8fda89621-kube-api-access-pmhqc\") pod \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\" (UID: \"d22fadcc-27c7-47ce-8cd4-29c8fda89621\") " Apr 24 16:56:31.107246 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:56:31.107222 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee0717db7c1223089313b1712e5f1a88e7b724ef90071dacac9dcaee60520d5d\": container with ID starting with 
ee0717db7c1223089313b1712e5f1a88e7b724ef90071dacac9dcaee60520d5d not found: ID does not exist" containerID="ee0717db7c1223089313b1712e5f1a88e7b724ef90071dacac9dcaee60520d5d" Apr 24 16:56:31.107325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.107257 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee0717db7c1223089313b1712e5f1a88e7b724ef90071dacac9dcaee60520d5d"} err="failed to get container status \"ee0717db7c1223089313b1712e5f1a88e7b724ef90071dacac9dcaee60520d5d\": rpc error: code = NotFound desc = could not find container \"ee0717db7c1223089313b1712e5f1a88e7b724ef90071dacac9dcaee60520d5d\": container with ID starting with ee0717db7c1223089313b1712e5f1a88e7b724ef90071dacac9dcaee60520d5d not found: ID does not exist" Apr 24 16:56:31.107325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.107278 2561 scope.go:117] "RemoveContainer" containerID="1e9a75f76faa3cd0098ee9f85cd68bc1958fe1ed54029b99006be817b4792feb" Apr 24 16:56:31.107447 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.107338 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d22fadcc-27c7-47ce-8cd4-29c8fda89621-error-404-isvc-2c4dd-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-2c4dd-kube-rbac-proxy-sar-config") pod "d22fadcc-27c7-47ce-8cd4-29c8fda89621" (UID: "d22fadcc-27c7-47ce-8cd4-29c8fda89621"). InnerVolumeSpecName "error-404-isvc-2c4dd-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:56:31.107552 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:56:31.107532 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e9a75f76faa3cd0098ee9f85cd68bc1958fe1ed54029b99006be817b4792feb\": container with ID starting with 1e9a75f76faa3cd0098ee9f85cd68bc1958fe1ed54029b99006be817b4792feb not found: ID does not exist" containerID="1e9a75f76faa3cd0098ee9f85cd68bc1958fe1ed54029b99006be817b4792feb" Apr 24 16:56:31.107614 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.107556 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e9a75f76faa3cd0098ee9f85cd68bc1958fe1ed54029b99006be817b4792feb"} err="failed to get container status \"1e9a75f76faa3cd0098ee9f85cd68bc1958fe1ed54029b99006be817b4792feb\": rpc error: code = NotFound desc = could not find container \"1e9a75f76faa3cd0098ee9f85cd68bc1958fe1ed54029b99006be817b4792feb\": container with ID starting with 1e9a75f76faa3cd0098ee9f85cd68bc1958fe1ed54029b99006be817b4792feb not found: ID does not exist" Apr 24 16:56:31.108949 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.108928 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d22fadcc-27c7-47ce-8cd4-29c8fda89621-kube-api-access-pmhqc" (OuterVolumeSpecName: "kube-api-access-pmhqc") pod "d22fadcc-27c7-47ce-8cd4-29c8fda89621" (UID: "d22fadcc-27c7-47ce-8cd4-29c8fda89621"). InnerVolumeSpecName "kube-api-access-pmhqc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:56:31.109022 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.108973 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d22fadcc-27c7-47ce-8cd4-29c8fda89621-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d22fadcc-27c7-47ce-8cd4-29c8fda89621" (UID: "d22fadcc-27c7-47ce-8cd4-29c8fda89621"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:56:31.112988 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.112966 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2c4dd-predictor-587bf956c8-px6z4"] Apr 24 16:56:31.207763 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.207741 2561 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-2c4dd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d22fadcc-27c7-47ce-8cd4-29c8fda89621-error-404-isvc-2c4dd-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:56:31.207763 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.207760 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d22fadcc-27c7-47ce-8cd4-29c8fda89621-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:56:31.207882 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.207770 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pmhqc\" (UniqueName: \"kubernetes.io/projected/d22fadcc-27c7-47ce-8cd4-29c8fda89621-kube-api-access-pmhqc\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:56:31.410022 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.409994 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt"] Apr 24 16:56:31.422677 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.422651 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-2c4dd-predictor-bdc7b5877-vpdlt"] Apr 24 16:56:31.651431 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.651361 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0352dfc6-f89c-4592-899c-58affbababb4" path="/var/lib/kubelet/pods/0352dfc6-f89c-4592-899c-58affbababb4/volumes" Apr 24 
16:56:31.651749 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:31.651738 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" path="/var/lib/kubelet/pods/d22fadcc-27c7-47ce-8cd4-29c8fda89621/volumes" Apr 24 16:56:35.074340 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:35.074306 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" Apr 24 16:56:35.074872 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:35.074836 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 16:56:35.075276 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:35.075257 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" Apr 24 16:56:35.075744 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:35.075723 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 16:56:45.075183 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:45.075086 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 16:56:45.075688 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:45.075663 2561 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 16:56:55.075438 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:55.075399 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 16:56:55.075804 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:56:55.075646 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 16:57:02.970178 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:02.970141 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"] Apr 24 16:57:02.970665 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:02.970428 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" podUID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerName="kserve-container" containerID="cri-o://96df00680669d85c98ed48d50bd23062727a0bb92b503e853ddd9061ad98cfbb" gracePeriod=30 Apr 24 16:57:02.970665 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:02.970460 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" podUID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerName="kube-rbac-proxy" 
containerID="cri-o://f58e4c96aaaa4af5448f92cecfc232b460249299182c196c20bf9e9ef9c6626d" gracePeriod=30 Apr 24 16:57:03.007708 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.007684 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6"] Apr 24 16:57:03.008063 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.008050 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerName="kserve-container" Apr 24 16:57:03.008104 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.008066 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerName="kserve-container" Apr 24 16:57:03.008104 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.008076 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerName="kube-rbac-proxy" Apr 24 16:57:03.008104 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.008081 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerName="kube-rbac-proxy" Apr 24 16:57:03.008104 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.008091 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0352dfc6-f89c-4592-899c-58affbababb4" containerName="kserve-container" Apr 24 16:57:03.008104 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.008096 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="0352dfc6-f89c-4592-899c-58affbababb4" containerName="kserve-container" Apr 24 16:57:03.008104 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.008103 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0352dfc6-f89c-4592-899c-58affbababb4" containerName="kube-rbac-proxy" Apr 24 16:57:03.008295 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.008108 2561 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="0352dfc6-f89c-4592-899c-58affbababb4" containerName="kube-rbac-proxy" Apr 24 16:57:03.008295 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.008183 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerName="kserve-container" Apr 24 16:57:03.008295 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.008193 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="d22fadcc-27c7-47ce-8cd4-29c8fda89621" containerName="kube-rbac-proxy" Apr 24 16:57:03.008295 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.008201 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="0352dfc6-f89c-4592-899c-58affbababb4" containerName="kserve-container" Apr 24 16:57:03.008295 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.008207 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="0352dfc6-f89c-4592-899c-58affbababb4" containerName="kube-rbac-proxy" Apr 24 16:57:03.012830 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.012815 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:03.015220 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.015198 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-3eac5-predictor-serving-cert\"" Apr 24 16:57:03.015320 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.015300 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-3eac5-kube-rbac-proxy-sar-config\"" Apr 24 16:57:03.022226 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.022206 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6"] Apr 24 16:57:03.080597 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.080561 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"] Apr 24 16:57:03.080951 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.080925 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kserve-container" containerID="cri-o://5b37c8c346814500d9324c1164bc1ab1b23eb94c07ef039eb3bea4d5f575e572" gracePeriod=30 Apr 24 16:57:03.081017 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.080962 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kube-rbac-proxy" containerID="cri-o://b302960066107c29aeae15a3aa500ce451f194bdb002d64babd165df54b4db01" gracePeriod=30 Apr 24 16:57:03.134424 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.134398 2561 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv"] Apr 24 16:57:03.138498 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.138481 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:03.140933 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.140912 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-3eac5-kube-rbac-proxy-sar-config\"" Apr 24 16:57:03.141134 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.140943 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-3eac5-predictor-serving-cert\"" Apr 24 16:57:03.147644 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.147628 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv"] Apr 24 16:57:03.159515 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.159493 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldvz8\" (UniqueName: \"kubernetes.io/projected/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-kube-api-access-ldvz8\") pod \"success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6\" (UID: \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\") " pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:03.159604 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.159538 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-3eac5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-success-200-isvc-3eac5-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6\" (UID: \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\") " 
pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:03.159660 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.159636 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-proxy-tls\") pod \"success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6\" (UID: \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\") " pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:03.183475 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.183452 2561 generic.go:358] "Generic (PLEG): container finished" podID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerID="f58e4c96aaaa4af5448f92cecfc232b460249299182c196c20bf9e9ef9c6626d" exitCode=2 Apr 24 16:57:03.183596 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.183516 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" event={"ID":"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0","Type":"ContainerDied","Data":"f58e4c96aaaa4af5448f92cecfc232b460249299182c196c20bf9e9ef9c6626d"} Apr 24 16:57:03.260965 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.260900 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldvz8\" (UniqueName: \"kubernetes.io/projected/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-kube-api-access-ldvz8\") pod \"success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6\" (UID: \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\") " pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:03.260965 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.260941 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-3eac5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-success-200-isvc-3eac5-kube-rbac-proxy-sar-config\") pod 
\"success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6\" (UID: \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\") " pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:03.260965 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.260963 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cgsk\" (UniqueName: \"kubernetes.io/projected/92839f22-2bdc-4478-8460-1f9edbf864e9-kube-api-access-6cgsk\") pod \"error-404-isvc-3eac5-predictor-5466454db9-flmsv\" (UID: \"92839f22-2bdc-4478-8460-1f9edbf864e9\") " pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:03.261212 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.261017 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-proxy-tls\") pod \"success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6\" (UID: \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\") " pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:03.261212 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.261091 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-3eac5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/92839f22-2bdc-4478-8460-1f9edbf864e9-error-404-isvc-3eac5-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-3eac5-predictor-5466454db9-flmsv\" (UID: \"92839f22-2bdc-4478-8460-1f9edbf864e9\") " pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:03.261212 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:57:03.261131 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-serving-cert: secret "success-200-isvc-3eac5-predictor-serving-cert" not found Apr 24 16:57:03.261212 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.261171 2561 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92839f22-2bdc-4478-8460-1f9edbf864e9-proxy-tls\") pod \"error-404-isvc-3eac5-predictor-5466454db9-flmsv\" (UID: \"92839f22-2bdc-4478-8460-1f9edbf864e9\") " pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:03.261212 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:57:03.261201 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-proxy-tls podName:9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2 nodeName:}" failed. No retries permitted until 2026-04-24 16:57:03.761180408 +0000 UTC m=+1092.614051099 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-proxy-tls") pod "success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" (UID: "9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2") : secret "success-200-isvc-3eac5-predictor-serving-cert" not found Apr 24 16:57:03.261586 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.261569 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-3eac5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-success-200-isvc-3eac5-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6\" (UID: \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\") " pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:03.273707 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.273681 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldvz8\" (UniqueName: \"kubernetes.io/projected/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-kube-api-access-ldvz8\") pod \"success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6\" (UID: \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\") " 
pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:03.362244 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.362216 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-3eac5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/92839f22-2bdc-4478-8460-1f9edbf864e9-error-404-isvc-3eac5-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-3eac5-predictor-5466454db9-flmsv\" (UID: \"92839f22-2bdc-4478-8460-1f9edbf864e9\") " pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:03.362382 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.362257 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92839f22-2bdc-4478-8460-1f9edbf864e9-proxy-tls\") pod \"error-404-isvc-3eac5-predictor-5466454db9-flmsv\" (UID: \"92839f22-2bdc-4478-8460-1f9edbf864e9\") " pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:03.362382 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.362303 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cgsk\" (UniqueName: \"kubernetes.io/projected/92839f22-2bdc-4478-8460-1f9edbf864e9-kube-api-access-6cgsk\") pod \"error-404-isvc-3eac5-predictor-5466454db9-flmsv\" (UID: \"92839f22-2bdc-4478-8460-1f9edbf864e9\") " pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:03.362494 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:57:03.362431 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-serving-cert: secret "error-404-isvc-3eac5-predictor-serving-cert" not found Apr 24 16:57:03.362546 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:57:03.362498 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92839f22-2bdc-4478-8460-1f9edbf864e9-proxy-tls 
podName:92839f22-2bdc-4478-8460-1f9edbf864e9 nodeName:}" failed. No retries permitted until 2026-04-24 16:57:03.862479515 +0000 UTC m=+1092.715350212 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/92839f22-2bdc-4478-8460-1f9edbf864e9-proxy-tls") pod "error-404-isvc-3eac5-predictor-5466454db9-flmsv" (UID: "92839f22-2bdc-4478-8460-1f9edbf864e9") : secret "error-404-isvc-3eac5-predictor-serving-cert" not found Apr 24 16:57:03.362807 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.362789 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-3eac5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/92839f22-2bdc-4478-8460-1f9edbf864e9-error-404-isvc-3eac5-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-3eac5-predictor-5466454db9-flmsv\" (UID: \"92839f22-2bdc-4478-8460-1f9edbf864e9\") " pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:03.374563 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.374531 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cgsk\" (UniqueName: \"kubernetes.io/projected/92839f22-2bdc-4478-8460-1f9edbf864e9-kube-api-access-6cgsk\") pod \"error-404-isvc-3eac5-predictor-5466454db9-flmsv\" (UID: \"92839f22-2bdc-4478-8460-1f9edbf864e9\") " pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:03.764935 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.764895 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-proxy-tls\") pod \"success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6\" (UID: \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\") " pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:03.765143 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:57:03.764999 2561 
secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-serving-cert: secret "success-200-isvc-3eac5-predictor-serving-cert" not found Apr 24 16:57:03.765143 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:57:03.765073 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-proxy-tls podName:9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2 nodeName:}" failed. No retries permitted until 2026-04-24 16:57:04.765051823 +0000 UTC m=+1093.617922515 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-proxy-tls") pod "success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" (UID: "9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2") : secret "success-200-isvc-3eac5-predictor-serving-cert" not found Apr 24 16:57:03.865662 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.865633 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92839f22-2bdc-4478-8460-1f9edbf864e9-proxy-tls\") pod \"error-404-isvc-3eac5-predictor-5466454db9-flmsv\" (UID: \"92839f22-2bdc-4478-8460-1f9edbf864e9\") " pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:03.873551 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:03.873500 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92839f22-2bdc-4478-8460-1f9edbf864e9-proxy-tls\") pod \"error-404-isvc-3eac5-predictor-5466454db9-flmsv\" (UID: \"92839f22-2bdc-4478-8460-1f9edbf864e9\") " pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:04.050901 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:04.050831 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:04.173345 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:04.173089 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv"] Apr 24 16:57:04.176330 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:57:04.176291 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92839f22_2bdc_4478_8460_1f9edbf864e9.slice/crio-21a2118454f3c1ab47630e69522d27ac354cf60c631a4343b25cc40001122851 WatchSource:0}: Error finding container 21a2118454f3c1ab47630e69522d27ac354cf60c631a4343b25cc40001122851: Status 404 returned error can't find the container with id 21a2118454f3c1ab47630e69522d27ac354cf60c631a4343b25cc40001122851 Apr 24 16:57:04.188020 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:04.187996 2561 generic.go:358] "Generic (PLEG): container finished" podID="a8a1736d-f782-4dea-9285-96f0b6062300" containerID="b302960066107c29aeae15a3aa500ce451f194bdb002d64babd165df54b4db01" exitCode=2 Apr 24 16:57:04.188135 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:04.188063 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" event={"ID":"a8a1736d-f782-4dea-9285-96f0b6062300","Type":"ContainerDied","Data":"b302960066107c29aeae15a3aa500ce451f194bdb002d64babd165df54b4db01"} Apr 24 16:57:04.188999 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:04.188980 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" event={"ID":"92839f22-2bdc-4478-8460-1f9edbf864e9","Type":"ContainerStarted","Data":"21a2118454f3c1ab47630e69522d27ac354cf60c631a4343b25cc40001122851"} Apr 24 16:57:04.776813 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:04.776779 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-proxy-tls\") pod \"success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6\" (UID: \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\") " pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:04.779089 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:04.779072 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-proxy-tls\") pod \"success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6\" (UID: \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\") " pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:04.824126 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:04.824100 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:04.947857 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:04.947828 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6"] Apr 24 16:57:04.950780 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:57:04.950756 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e8d6087_9ff2_4eb9_b90f_ae8f84bd9cc2.slice/crio-b513609b2db068efc79e971712dae468f7103dec6ecaaf14236c9a71e88b4cbb WatchSource:0}: Error finding container b513609b2db068efc79e971712dae468f7103dec6ecaaf14236c9a71e88b4cbb: Status 404 returned error can't find the container with id b513609b2db068efc79e971712dae468f7103dec6ecaaf14236c9a71e88b4cbb Apr 24 16:57:05.076135 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:05.075083 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" 
podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 24 16:57:05.076497 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:05.076299 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 24 16:57:05.193671 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:05.193632 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" event={"ID":"92839f22-2bdc-4478-8460-1f9edbf864e9","Type":"ContainerStarted","Data":"f31effb804f9b45fe75aeb367b7232c2b89d598ada719ce2d9d9d393467c095a"} Apr 24 16:57:05.193671 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:05.193671 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" event={"ID":"92839f22-2bdc-4478-8460-1f9edbf864e9","Type":"ContainerStarted","Data":"874fd37b87e8caa351339f9e0b8de351fa5b644d0bd964ab3b0045429abdda31"} Apr 24 16:57:05.193897 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:05.193736 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:05.195044 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:05.195022 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" event={"ID":"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2","Type":"ContainerStarted","Data":"c01cf7b663c22bd5bd8d25f2c863b0b454a64ad5f4ecc1a9f53ee995e9bb252b"} Apr 24 16:57:05.195170 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:05.195048 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" event={"ID":"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2","Type":"ContainerStarted","Data":"9c15c3b3be85b503c9d20aeec0208209af50bd6688421d3271c61f0e06ca0237"} Apr 24 16:57:05.195170 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:05.195061 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" event={"ID":"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2","Type":"ContainerStarted","Data":"b513609b2db068efc79e971712dae468f7103dec6ecaaf14236c9a71e88b4cbb"} Apr 24 16:57:05.195287 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:05.195276 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:05.195318 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:05.195290 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:05.196623 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:05.196603 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 16:57:05.266092 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:05.266059 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" podStartSLOduration=2.266046697 podStartE2EDuration="2.266046697s" podCreationTimestamp="2026-04-24 16:57:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:57:05.236421597 +0000 UTC m=+1094.089292310" 
watchObservedRunningTime="2026-04-24 16:57:05.266046697 +0000 UTC m=+1094.118917409" Apr 24 16:57:06.199360 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.199333 2561 generic.go:358] "Generic (PLEG): container finished" podID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerID="96df00680669d85c98ed48d50bd23062727a0bb92b503e853ddd9061ad98cfbb" exitCode=0 Apr 24 16:57:06.199677 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.199405 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" event={"ID":"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0","Type":"ContainerDied","Data":"96df00680669d85c98ed48d50bd23062727a0bb92b503e853ddd9061ad98cfbb"} Apr 24 16:57:06.199864 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.199837 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 16:57:06.200029 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.199997 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:06.201289 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.201262 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" podUID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 24 16:57:06.339037 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.339017 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" Apr 24 16:57:06.361357 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.361311 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" podStartSLOduration=4.361294219 podStartE2EDuration="4.361294219s" podCreationTimestamp="2026-04-24 16:57:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:57:05.268007953 +0000 UTC m=+1094.120878667" watchObservedRunningTime="2026-04-24 16:57:06.361294219 +0000 UTC m=+1095.214164933" Apr 24 16:57:06.465232 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.465200 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.34:8643/healthz\": dial tcp 10.133.0.34:8643: connect: connection refused" Apr 24 16:57:06.470660 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.470623 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 24 16:57:06.497391 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.497370 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-proxy-tls\") pod \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\" (UID: \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\") " Apr 24 16:57:06.497521 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.497443 2561 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"success-200-isvc-30dd6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-success-200-isvc-30dd6-kube-rbac-proxy-sar-config\") pod \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\" (UID: \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\") " Apr 24 16:57:06.497521 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.497501 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l8wj\" (UniqueName: \"kubernetes.io/projected/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-kube-api-access-4l8wj\") pod \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\" (UID: \"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0\") " Apr 24 16:57:06.497744 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.497709 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-success-200-isvc-30dd6-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-30dd6-kube-rbac-proxy-sar-config") pod "b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" (UID: "b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0"). InnerVolumeSpecName "success-200-isvc-30dd6-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:57:06.499430 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.499409 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-kube-api-access-4l8wj" (OuterVolumeSpecName: "kube-api-access-4l8wj") pod "b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" (UID: "b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0"). InnerVolumeSpecName "kube-api-access-4l8wj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:57:06.499554 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.499527 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" (UID: "b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:57:06.598003 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.597978 2561 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-30dd6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-success-200-isvc-30dd6-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:57:06.598003 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.598001 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4l8wj\" (UniqueName: \"kubernetes.io/projected/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-kube-api-access-4l8wj\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:57:06.598166 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.598011 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:57:06.812786 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:06.812766 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" Apr 24 16:57:07.000959 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.000881 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbhgk\" (UniqueName: \"kubernetes.io/projected/a8a1736d-f782-4dea-9285-96f0b6062300-kube-api-access-mbhgk\") pod \"a8a1736d-f782-4dea-9285-96f0b6062300\" (UID: \"a8a1736d-f782-4dea-9285-96f0b6062300\") " Apr 24 16:57:07.000959 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.000931 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8a1736d-f782-4dea-9285-96f0b6062300-proxy-tls\") pod \"a8a1736d-f782-4dea-9285-96f0b6062300\" (UID: \"a8a1736d-f782-4dea-9285-96f0b6062300\") " Apr 24 16:57:07.001152 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.001068 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-30dd6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8a1736d-f782-4dea-9285-96f0b6062300-error-404-isvc-30dd6-kube-rbac-proxy-sar-config\") pod \"a8a1736d-f782-4dea-9285-96f0b6062300\" (UID: \"a8a1736d-f782-4dea-9285-96f0b6062300\") " Apr 24 16:57:07.001400 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.001373 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8a1736d-f782-4dea-9285-96f0b6062300-error-404-isvc-30dd6-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-30dd6-kube-rbac-proxy-sar-config") pod "a8a1736d-f782-4dea-9285-96f0b6062300" (UID: "a8a1736d-f782-4dea-9285-96f0b6062300"). InnerVolumeSpecName "error-404-isvc-30dd6-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:57:07.002873 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.002852 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8a1736d-f782-4dea-9285-96f0b6062300-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a8a1736d-f782-4dea-9285-96f0b6062300" (UID: "a8a1736d-f782-4dea-9285-96f0b6062300"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:57:07.002936 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.002853 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8a1736d-f782-4dea-9285-96f0b6062300-kube-api-access-mbhgk" (OuterVolumeSpecName: "kube-api-access-mbhgk") pod "a8a1736d-f782-4dea-9285-96f0b6062300" (UID: "a8a1736d-f782-4dea-9285-96f0b6062300"). InnerVolumeSpecName "kube-api-access-mbhgk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:57:07.102525 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.102494 2561 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-30dd6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a8a1736d-f782-4dea-9285-96f0b6062300-error-404-isvc-30dd6-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:57:07.102525 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.102521 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mbhgk\" (UniqueName: \"kubernetes.io/projected/a8a1736d-f782-4dea-9285-96f0b6062300-kube-api-access-mbhgk\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:57:07.102525 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.102530 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a8a1736d-f782-4dea-9285-96f0b6062300-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:57:07.204045 
ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.204023 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" Apr 24 16:57:07.204402 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.204013 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f" event={"ID":"b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0","Type":"ContainerDied","Data":"55231d3618229f8c0ebb0047bf2b884bf0dfdb96d4093478511e399c775d6a37"} Apr 24 16:57:07.204402 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.204157 2561 scope.go:117] "RemoveContainer" containerID="f58e4c96aaaa4af5448f92cecfc232b460249299182c196c20bf9e9ef9c6626d" Apr 24 16:57:07.205389 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.205299 2561 generic.go:358] "Generic (PLEG): container finished" podID="a8a1736d-f782-4dea-9285-96f0b6062300" containerID="5b37c8c346814500d9324c1164bc1ab1b23eb94c07ef039eb3bea4d5f575e572" exitCode=0 Apr 24 16:57:07.205389 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.205378 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" Apr 24 16:57:07.205498 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.205383 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" event={"ID":"a8a1736d-f782-4dea-9285-96f0b6062300","Type":"ContainerDied","Data":"5b37c8c346814500d9324c1164bc1ab1b23eb94c07ef039eb3bea4d5f575e572"} Apr 24 16:57:07.205498 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.205410 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r" event={"ID":"a8a1736d-f782-4dea-9285-96f0b6062300","Type":"ContainerDied","Data":"0e57a2982f89096707b0c2898989804e40928cb0de7dba3cda032e500a2e0644"} Apr 24 16:57:07.206007 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.205981 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" podUID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 24 16:57:07.212210 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.212192 2561 scope.go:117] "RemoveContainer" containerID="96df00680669d85c98ed48d50bd23062727a0bb92b503e853ddd9061ad98cfbb" Apr 24 16:57:07.219714 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.219699 2561 scope.go:117] "RemoveContainer" containerID="b302960066107c29aeae15a3aa500ce451f194bdb002d64babd165df54b4db01" Apr 24 16:57:07.226272 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.226257 2561 scope.go:117] "RemoveContainer" containerID="5b37c8c346814500d9324c1164bc1ab1b23eb94c07ef039eb3bea4d5f575e572" Apr 24 16:57:07.232477 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.232457 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"] Apr 24 
16:57:07.232905 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.232893 2561 scope.go:117] "RemoveContainer" containerID="b302960066107c29aeae15a3aa500ce451f194bdb002d64babd165df54b4db01" Apr 24 16:57:07.233146 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:57:07.233110 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b302960066107c29aeae15a3aa500ce451f194bdb002d64babd165df54b4db01\": container with ID starting with b302960066107c29aeae15a3aa500ce451f194bdb002d64babd165df54b4db01 not found: ID does not exist" containerID="b302960066107c29aeae15a3aa500ce451f194bdb002d64babd165df54b4db01" Apr 24 16:57:07.233185 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.233153 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b302960066107c29aeae15a3aa500ce451f194bdb002d64babd165df54b4db01"} err="failed to get container status \"b302960066107c29aeae15a3aa500ce451f194bdb002d64babd165df54b4db01\": rpc error: code = NotFound desc = could not find container \"b302960066107c29aeae15a3aa500ce451f194bdb002d64babd165df54b4db01\": container with ID starting with b302960066107c29aeae15a3aa500ce451f194bdb002d64babd165df54b4db01 not found: ID does not exist" Apr 24 16:57:07.233185 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.233168 2561 scope.go:117] "RemoveContainer" containerID="5b37c8c346814500d9324c1164bc1ab1b23eb94c07ef039eb3bea4d5f575e572" Apr 24 16:57:07.233393 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:57:07.233376 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b37c8c346814500d9324c1164bc1ab1b23eb94c07ef039eb3bea4d5f575e572\": container with ID starting with 5b37c8c346814500d9324c1164bc1ab1b23eb94c07ef039eb3bea4d5f575e572 not found: ID does not exist" containerID="5b37c8c346814500d9324c1164bc1ab1b23eb94c07ef039eb3bea4d5f575e572" Apr 24 16:57:07.233439 
ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.233399 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b37c8c346814500d9324c1164bc1ab1b23eb94c07ef039eb3bea4d5f575e572"} err="failed to get container status \"5b37c8c346814500d9324c1164bc1ab1b23eb94c07ef039eb3bea4d5f575e572\": rpc error: code = NotFound desc = could not find container \"5b37c8c346814500d9324c1164bc1ab1b23eb94c07ef039eb3bea4d5f575e572\": container with ID starting with 5b37c8c346814500d9324c1164bc1ab1b23eb94c07ef039eb3bea4d5f575e572 not found: ID does not exist" Apr 24 16:57:07.237206 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.237189 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-30dd6-predictor-94645b55d-96c4f"] Apr 24 16:57:07.247256 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.247237 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"] Apr 24 16:57:07.251043 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.250991 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-30dd6-predictor-79f6568b65-kq94r"] Apr 24 16:57:07.651279 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.651190 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" path="/var/lib/kubelet/pods/a8a1736d-f782-4dea-9285-96f0b6062300/volumes" Apr 24 16:57:07.651684 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:07.651664 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" path="/var/lib/kubelet/pods/b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0/volumes" Apr 24 16:57:11.204257 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:11.204230 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 
16:57:11.204700 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:11.204676 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 16:57:12.209928 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:12.209897 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:12.210467 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:12.210442 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" podUID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 24 16:57:15.075478 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:15.075450 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" Apr 24 16:57:15.076251 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:15.076233 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" Apr 24 16:57:21.205044 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:21.205005 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 24 16:57:22.210473 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:22.210433 2561 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" podUID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 24 16:57:31.204977 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:31.204936 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 24 16:57:32.210442 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:32.210403 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" podUID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused"
Apr 24 16:57:37.468958 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.468917 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj"]
Apr 24 16:57:37.469454 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.469300 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerName="kserve-container" containerID="cri-o://76beabac42033d96d9a1c57cc5e10feab7f0fbb40ba8e8aacdd6b7825f43b185" gracePeriod=30
Apr 24 16:57:37.470007 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.469638 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerName="kube-rbac-proxy" containerID="cri-o://f8c6d5d282c477c33efdfcef4ba3370dadd8b219e03ae0543f481e4962110e0b" gracePeriod=30
Apr 24 16:57:37.513567 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.513544 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"]
Apr 24 16:57:37.513893 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.513882 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerName="kserve-container"
Apr 24 16:57:37.513934 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.513896 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerName="kserve-container"
Apr 24 16:57:37.513934 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.513914 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerName="kube-rbac-proxy"
Apr 24 16:57:37.513934 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.513919 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerName="kube-rbac-proxy"
Apr 24 16:57:37.513934 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.513925 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kube-rbac-proxy"
Apr 24 16:57:37.513934 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.513930 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kube-rbac-proxy"
Apr 24 16:57:37.514084 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.513942 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kserve-container"
Apr 24 16:57:37.514084 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.513947 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kserve-container"
Apr 24 16:57:37.514084 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.514006 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kserve-container"
Apr 24 16:57:37.514084 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.514015 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8a1736d-f782-4dea-9285-96f0b6062300" containerName="kube-rbac-proxy"
Apr 24 16:57:37.514084 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.514022 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerName="kube-rbac-proxy"
Apr 24 16:57:37.514084 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.514028 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b245aa-cb3e-44ab-8c6e-8d8cf0d3a0c0" containerName="kserve-container"
Apr 24 16:57:37.517211 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.517196 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"
Apr 24 16:57:37.521403 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.521384 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-510ff-predictor-serving-cert\""
Apr 24 16:57:37.521805 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.521788 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-510ff-kube-rbac-proxy-sar-config\""
Apr 24 16:57:37.545739 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.545715 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"]
Apr 24 16:57:37.590501 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.590476 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6"]
Apr 24 16:57:37.590755 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.590730 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerName="kserve-container" containerID="cri-o://616bc54742b57b5103a8e137273f38d8a7eda7feffd8d27a74f8e4f2637df5d2" gracePeriod=30
Apr 24 16:57:37.590845 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.590743 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerName="kube-rbac-proxy" containerID="cri-o://e0cbcaed5ff2a3ec75ce70c63c9ce18921c6963a807baac962c794372408f232" gracePeriod=30
Apr 24 16:57:37.626507 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.626478 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42vqb\" (UniqueName: \"kubernetes.io/projected/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-kube-api-access-42vqb\") pod \"success-200-isvc-510ff-predictor-7d689d4d56-kfdtb\" (UID: \"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9\") " pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"
Apr 24 16:57:37.626621 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.626542 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-proxy-tls\") pod \"success-200-isvc-510ff-predictor-7d689d4d56-kfdtb\" (UID: \"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9\") " pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"
Apr 24 16:57:37.626621 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.626585 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-510ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-success-200-isvc-510ff-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-510ff-predictor-7d689d4d56-kfdtb\" (UID: \"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9\") " pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"
Apr 24 16:57:37.666466 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.666439 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"]
Apr 24 16:57:37.670444 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.670420 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"
Apr 24 16:57:37.673223 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.673196 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-510ff-kube-rbac-proxy-sar-config\""
Apr 24 16:57:37.673501 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.673484 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-510ff-predictor-serving-cert\""
Apr 24 16:57:37.684557 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.684535 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"]
Apr 24 16:57:37.727324 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.727269 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kksp\" (UniqueName: \"kubernetes.io/projected/28719115-6329-46e4-8521-9fbf569b1aca-kube-api-access-4kksp\") pod \"error-404-isvc-510ff-predictor-7b675fd9f5-vq786\" (UID: \"28719115-6329-46e4-8521-9fbf569b1aca\") " pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"
Apr 24 16:57:37.727324 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.727313 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42vqb\" (UniqueName: \"kubernetes.io/projected/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-kube-api-access-42vqb\") pod \"success-200-isvc-510ff-predictor-7d689d4d56-kfdtb\" (UID: \"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9\") " pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"
Apr 24 16:57:37.727518 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.727362 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-proxy-tls\") pod \"success-200-isvc-510ff-predictor-7d689d4d56-kfdtb\" (UID: \"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9\") " pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"
Apr 24 16:57:37.727518 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.727386 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-510ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/28719115-6329-46e4-8521-9fbf569b1aca-error-404-isvc-510ff-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-510ff-predictor-7b675fd9f5-vq786\" (UID: \"28719115-6329-46e4-8521-9fbf569b1aca\") " pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"
Apr 24 16:57:37.727518 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.727419 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28719115-6329-46e4-8521-9fbf569b1aca-proxy-tls\") pod \"error-404-isvc-510ff-predictor-7b675fd9f5-vq786\" (UID: \"28719115-6329-46e4-8521-9fbf569b1aca\") " pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"
Apr 24 16:57:37.727518 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.727458 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-510ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-success-200-isvc-510ff-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-510ff-predictor-7d689d4d56-kfdtb\" (UID: \"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9\") " pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"
Apr 24 16:57:37.728057 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.728035 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-510ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-success-200-isvc-510ff-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-510ff-predictor-7d689d4d56-kfdtb\" (UID: \"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9\") " pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"
Apr 24 16:57:37.729569 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.729547 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-proxy-tls\") pod \"success-200-isvc-510ff-predictor-7d689d4d56-kfdtb\" (UID: \"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9\") " pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"
Apr 24 16:57:37.736833 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.736806 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42vqb\" (UniqueName: \"kubernetes.io/projected/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-kube-api-access-42vqb\") pod \"success-200-isvc-510ff-predictor-7d689d4d56-kfdtb\" (UID: \"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9\") " pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"
Apr 24 16:57:37.826686 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.826656 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"
Apr 24 16:57:37.828490 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.828469 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kksp\" (UniqueName: \"kubernetes.io/projected/28719115-6329-46e4-8521-9fbf569b1aca-kube-api-access-4kksp\") pod \"error-404-isvc-510ff-predictor-7b675fd9f5-vq786\" (UID: \"28719115-6329-46e4-8521-9fbf569b1aca\") " pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"
Apr 24 16:57:37.828554 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.828528 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-510ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/28719115-6329-46e4-8521-9fbf569b1aca-error-404-isvc-510ff-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-510ff-predictor-7b675fd9f5-vq786\" (UID: \"28719115-6329-46e4-8521-9fbf569b1aca\") " pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"
Apr 24 16:57:37.828591 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.828551 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28719115-6329-46e4-8521-9fbf569b1aca-proxy-tls\") pod \"error-404-isvc-510ff-predictor-7b675fd9f5-vq786\" (UID: \"28719115-6329-46e4-8521-9fbf569b1aca\") " pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"
Apr 24 16:57:37.829205 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.829181 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-510ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/28719115-6329-46e4-8521-9fbf569b1aca-error-404-isvc-510ff-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-510ff-predictor-7b675fd9f5-vq786\" (UID: \"28719115-6329-46e4-8521-9fbf569b1aca\") " pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"
Apr 24 16:57:37.833416 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.833394 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28719115-6329-46e4-8521-9fbf569b1aca-proxy-tls\") pod \"error-404-isvc-510ff-predictor-7b675fd9f5-vq786\" (UID: \"28719115-6329-46e4-8521-9fbf569b1aca\") " pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"
Apr 24 16:57:37.837861 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.837835 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kksp\" (UniqueName: \"kubernetes.io/projected/28719115-6329-46e4-8521-9fbf569b1aca-kube-api-access-4kksp\") pod \"error-404-isvc-510ff-predictor-7b675fd9f5-vq786\" (UID: \"28719115-6329-46e4-8521-9fbf569b1aca\") " pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"
Apr 24 16:57:37.956904 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.956881 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"]
Apr 24 16:57:37.959099 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:57:37.959070 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b34bdd3_4fb5_4a3e_a5f0_5318dbf718a9.slice/crio-319fb09dd1b0fe5ade154118a5548568734290f5c39209b85ba421e17e4a5eae WatchSource:0}: Error finding container 319fb09dd1b0fe5ade154118a5548568734290f5c39209b85ba421e17e4a5eae: Status 404 returned error can't find the container with id 319fb09dd1b0fe5ade154118a5548568734290f5c39209b85ba421e17e4a5eae
Apr 24 16:57:37.983420 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:37.983400 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"
Apr 24 16:57:38.114837 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:38.114807 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"]
Apr 24 16:57:38.116855 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:57:38.116829 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28719115_6329_46e4_8521_9fbf569b1aca.slice/crio-47745d39c03be646cd2437bd94b2aa80e642492f2ab9b322cc836ff3b2cdd318 WatchSource:0}: Error finding container 47745d39c03be646cd2437bd94b2aa80e642492f2ab9b322cc836ff3b2cdd318: Status 404 returned error can't find the container with id 47745d39c03be646cd2437bd94b2aa80e642492f2ab9b322cc836ff3b2cdd318
Apr 24 16:57:38.313141 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:38.313042 2561 generic.go:358] "Generic (PLEG): container finished" podID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerID="e0cbcaed5ff2a3ec75ce70c63c9ce18921c6963a807baac962c794372408f232" exitCode=2
Apr 24 16:57:38.313141 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:38.313112 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" event={"ID":"78bbb2de-6a30-45fd-ac95-13a9331a5f04","Type":"ContainerDied","Data":"e0cbcaed5ff2a3ec75ce70c63c9ce18921c6963a807baac962c794372408f232"}
Apr 24 16:57:38.314620 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:38.314596 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" event={"ID":"28719115-6329-46e4-8521-9fbf569b1aca","Type":"ContainerStarted","Data":"5c755b61f0fecd0a7ee6d363d3f19b0d92ccf06ed5ff26e915397bda3f110e02"}
Apr 24 16:57:38.314750 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:38.314626 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" event={"ID":"28719115-6329-46e4-8521-9fbf569b1aca","Type":"ContainerStarted","Data":"21e1989363af2ca3e2b86ce0cf017d0d3c38c4551ed4a5c6228eb7a70db4acd7"}
Apr 24 16:57:38.314750 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:38.314639 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" event={"ID":"28719115-6329-46e4-8521-9fbf569b1aca","Type":"ContainerStarted","Data":"47745d39c03be646cd2437bd94b2aa80e642492f2ab9b322cc836ff3b2cdd318"}
Apr 24 16:57:38.314750 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:38.314658 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"
Apr 24 16:57:38.315978 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:38.315957 2561 generic.go:358] "Generic (PLEG): container finished" podID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerID="f8c6d5d282c477c33efdfcef4ba3370dadd8b219e03ae0543f481e4962110e0b" exitCode=2
Apr 24 16:57:38.316064 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:38.316025 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" event={"ID":"ba06523a-200e-4f01-8d61-6e865e8fb74f","Type":"ContainerDied","Data":"f8c6d5d282c477c33efdfcef4ba3370dadd8b219e03ae0543f481e4962110e0b"}
Apr 24 16:57:38.317314 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:38.317292 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" event={"ID":"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9","Type":"ContainerStarted","Data":"c8f9f94b74e365867f95e1307e2857adc2be13bba8878b4c1fafeeb68a732709"}
Apr 24 16:57:38.317382 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:38.317319 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" event={"ID":"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9","Type":"ContainerStarted","Data":"935c98947ea71251ed14e0039b6841dbdb09e6259184d5b247182ce778f8f969"}
Apr 24 16:57:38.317382 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:38.317328 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" event={"ID":"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9","Type":"ContainerStarted","Data":"319fb09dd1b0fe5ade154118a5548568734290f5c39209b85ba421e17e4a5eae"}
Apr 24 16:57:38.317455 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:38.317437 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"
Apr 24 16:57:38.338181 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:38.338143 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" podStartSLOduration=1.338131935 podStartE2EDuration="1.338131935s" podCreationTimestamp="2026-04-24 16:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:57:38.336562612 +0000 UTC m=+1127.189433324" watchObservedRunningTime="2026-04-24 16:57:38.338131935 +0000 UTC m=+1127.191002800"
Apr 24 16:57:38.357679 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:38.357642 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" podStartSLOduration=1.357630208 podStartE2EDuration="1.357630208s" podCreationTimestamp="2026-04-24 16:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:57:38.356667265 +0000 UTC m=+1127.209537977" watchObservedRunningTime="2026-04-24 16:57:38.357630208 +0000 UTC m=+1127.210500926"
Apr 24 16:57:39.326918 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:39.326876 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"
Apr 24 16:57:39.326918 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:39.326923 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"
Apr 24 16:57:39.327757 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:39.327732 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 24 16:57:39.327832 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:39.327734 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" podUID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 24 16:57:40.070177 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:40.070130 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.36:8643/healthz\": dial tcp 10.133.0.36:8643: connect: connection refused"
Apr 24 16:57:40.070667 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:40.070136 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.35:8643/healthz\": dial tcp 10.133.0.35:8643: connect: connection refused"
Apr 24 16:57:40.330283 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:40.330196 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" podUID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 24 16:57:40.330599 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:40.330280 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 24 16:57:41.154856 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.154835 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6"
Apr 24 16:57:41.204659 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.204630 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 24 16:57:41.208656 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.208639 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj"
Apr 24 16:57:41.257949 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.257900 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d7jj\" (UniqueName: \"kubernetes.io/projected/78bbb2de-6a30-45fd-ac95-13a9331a5f04-kube-api-access-6d7jj\") pod \"78bbb2de-6a30-45fd-ac95-13a9331a5f04\" (UID: \"78bbb2de-6a30-45fd-ac95-13a9331a5f04\") "
Apr 24 16:57:41.257949 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.257936 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba06523a-200e-4f01-8d61-6e865e8fb74f-proxy-tls\") pod \"ba06523a-200e-4f01-8d61-6e865e8fb74f\" (UID: \"ba06523a-200e-4f01-8d61-6e865e8fb74f\") "
Apr 24 16:57:41.258141 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.257987 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-967vg\" (UniqueName: \"kubernetes.io/projected/ba06523a-200e-4f01-8d61-6e865e8fb74f-kube-api-access-967vg\") pod \"ba06523a-200e-4f01-8d61-6e865e8fb74f\" (UID: \"ba06523a-200e-4f01-8d61-6e865e8fb74f\") "
Apr 24 16:57:41.258141 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.258096 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-f6171-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/78bbb2de-6a30-45fd-ac95-13a9331a5f04-error-404-isvc-f6171-kube-rbac-proxy-sar-config\") pod \"78bbb2de-6a30-45fd-ac95-13a9331a5f04\" (UID: \"78bbb2de-6a30-45fd-ac95-13a9331a5f04\") "
Apr 24 16:57:41.258256 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.258162 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-f6171-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ba06523a-200e-4f01-8d61-6e865e8fb74f-success-200-isvc-f6171-kube-rbac-proxy-sar-config\") pod \"ba06523a-200e-4f01-8d61-6e865e8fb74f\" (UID: \"ba06523a-200e-4f01-8d61-6e865e8fb74f\") "
Apr 24 16:57:41.258256 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.258189 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78bbb2de-6a30-45fd-ac95-13a9331a5f04-proxy-tls\") pod \"78bbb2de-6a30-45fd-ac95-13a9331a5f04\" (UID: \"78bbb2de-6a30-45fd-ac95-13a9331a5f04\") "
Apr 24 16:57:41.258487 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.258451 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78bbb2de-6a30-45fd-ac95-13a9331a5f04-error-404-isvc-f6171-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-f6171-kube-rbac-proxy-sar-config") pod "78bbb2de-6a30-45fd-ac95-13a9331a5f04" (UID: "78bbb2de-6a30-45fd-ac95-13a9331a5f04"). InnerVolumeSpecName "error-404-isvc-f6171-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:57:41.258565 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.258517 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba06523a-200e-4f01-8d61-6e865e8fb74f-success-200-isvc-f6171-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-f6171-kube-rbac-proxy-sar-config") pod "ba06523a-200e-4f01-8d61-6e865e8fb74f" (UID: "ba06523a-200e-4f01-8d61-6e865e8fb74f"). InnerVolumeSpecName "success-200-isvc-f6171-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:57:41.259955 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.259935 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba06523a-200e-4f01-8d61-6e865e8fb74f-kube-api-access-967vg" (OuterVolumeSpecName: "kube-api-access-967vg") pod "ba06523a-200e-4f01-8d61-6e865e8fb74f" (UID: "ba06523a-200e-4f01-8d61-6e865e8fb74f"). InnerVolumeSpecName "kube-api-access-967vg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:57:41.260072 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.260055 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78bbb2de-6a30-45fd-ac95-13a9331a5f04-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "78bbb2de-6a30-45fd-ac95-13a9331a5f04" (UID: "78bbb2de-6a30-45fd-ac95-13a9331a5f04"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:57:41.260389 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.260368 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba06523a-200e-4f01-8d61-6e865e8fb74f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ba06523a-200e-4f01-8d61-6e865e8fb74f" (UID: "ba06523a-200e-4f01-8d61-6e865e8fb74f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:57:41.260389 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.260377 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78bbb2de-6a30-45fd-ac95-13a9331a5f04-kube-api-access-6d7jj" (OuterVolumeSpecName: "kube-api-access-6d7jj") pod "78bbb2de-6a30-45fd-ac95-13a9331a5f04" (UID: "78bbb2de-6a30-45fd-ac95-13a9331a5f04"). InnerVolumeSpecName "kube-api-access-6d7jj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:57:41.334513 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.334484 2561 generic.go:358] "Generic (PLEG): container finished" podID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerID="616bc54742b57b5103a8e137273f38d8a7eda7feffd8d27a74f8e4f2637df5d2" exitCode=0
Apr 24 16:57:41.334942 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.334563 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6"
Apr 24 16:57:41.334942 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.334566 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" event={"ID":"78bbb2de-6a30-45fd-ac95-13a9331a5f04","Type":"ContainerDied","Data":"616bc54742b57b5103a8e137273f38d8a7eda7feffd8d27a74f8e4f2637df5d2"}
Apr 24 16:57:41.334942 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.334605 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6" event={"ID":"78bbb2de-6a30-45fd-ac95-13a9331a5f04","Type":"ContainerDied","Data":"84c2c73c90414498754d413972669fe140a535ce9a234073746e8820ece7255f"}
Apr 24 16:57:41.334942 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.334627 2561 scope.go:117] "RemoveContainer" containerID="e0cbcaed5ff2a3ec75ce70c63c9ce18921c6963a807baac962c794372408f232"
Apr 24 16:57:41.336038 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.336016 2561 generic.go:358] "Generic (PLEG): container finished" podID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerID="76beabac42033d96d9a1c57cc5e10feab7f0fbb40ba8e8aacdd6b7825f43b185" exitCode=0
Apr 24 16:57:41.336140 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.336073 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" event={"ID":"ba06523a-200e-4f01-8d61-6e865e8fb74f","Type":"ContainerDied","Data":"76beabac42033d96d9a1c57cc5e10feab7f0fbb40ba8e8aacdd6b7825f43b185"}
Apr 24 16:57:41.336140 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.336096 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj" event={"ID":"ba06523a-200e-4f01-8d61-6e865e8fb74f","Type":"ContainerDied","Data":"68da5e19b06315d5ae5ba5513231d737fb40f76973d48ec8ed2e073aeaa1eb7a"}
Apr 24 16:57:41.336214 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.336141 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj"
Apr 24 16:57:41.344543 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.344523 2561 scope.go:117] "RemoveContainer" containerID="616bc54742b57b5103a8e137273f38d8a7eda7feffd8d27a74f8e4f2637df5d2"
Apr 24 16:57:41.351634 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.351620 2561 scope.go:117] "RemoveContainer" containerID="e0cbcaed5ff2a3ec75ce70c63c9ce18921c6963a807baac962c794372408f232"
Apr 24 16:57:41.351856 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:57:41.351838 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0cbcaed5ff2a3ec75ce70c63c9ce18921c6963a807baac962c794372408f232\": container with ID starting with e0cbcaed5ff2a3ec75ce70c63c9ce18921c6963a807baac962c794372408f232 not found: ID does not exist" containerID="e0cbcaed5ff2a3ec75ce70c63c9ce18921c6963a807baac962c794372408f232"
Apr 24 16:57:41.351895 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.351864 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0cbcaed5ff2a3ec75ce70c63c9ce18921c6963a807baac962c794372408f232"} err="failed to get container status \"e0cbcaed5ff2a3ec75ce70c63c9ce18921c6963a807baac962c794372408f232\": rpc error: code = NotFound desc = could not find container \"e0cbcaed5ff2a3ec75ce70c63c9ce18921c6963a807baac962c794372408f232\": container with ID starting with e0cbcaed5ff2a3ec75ce70c63c9ce18921c6963a807baac962c794372408f232 not found: ID does not exist"
Apr 24 16:57:41.351895 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.351879 2561 scope.go:117] "RemoveContainer" containerID="616bc54742b57b5103a8e137273f38d8a7eda7feffd8d27a74f8e4f2637df5d2"
Apr 24 16:57:41.352079 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:57:41.352065 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"616bc54742b57b5103a8e137273f38d8a7eda7feffd8d27a74f8e4f2637df5d2\": container with ID starting with 616bc54742b57b5103a8e137273f38d8a7eda7feffd8d27a74f8e4f2637df5d2 not found: ID does not exist" containerID="616bc54742b57b5103a8e137273f38d8a7eda7feffd8d27a74f8e4f2637df5d2"
Apr 24 16:57:41.352157 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.352083 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616bc54742b57b5103a8e137273f38d8a7eda7feffd8d27a74f8e4f2637df5d2"} err="failed to get container status \"616bc54742b57b5103a8e137273f38d8a7eda7feffd8d27a74f8e4f2637df5d2\": rpc error: code = NotFound desc = could not find container \"616bc54742b57b5103a8e137273f38d8a7eda7feffd8d27a74f8e4f2637df5d2\": container with ID starting with 616bc54742b57b5103a8e137273f38d8a7eda7feffd8d27a74f8e4f2637df5d2 not found: ID does not exist"
Apr 24 16:57:41.352157 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.352096 2561 scope.go:117] "RemoveContainer" containerID="f8c6d5d282c477c33efdfcef4ba3370dadd8b219e03ae0543f481e4962110e0b"
Apr 24 16:57:41.358360 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.358344 2561 scope.go:117] "RemoveContainer" containerID="76beabac42033d96d9a1c57cc5e10feab7f0fbb40ba8e8aacdd6b7825f43b185"
Apr 24 16:57:41.358829 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.358806 2561 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-f6171-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/78bbb2de-6a30-45fd-ac95-13a9331a5f04-error-404-isvc-f6171-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:57:41.358913 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.358834 2561 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-f6171-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ba06523a-200e-4f01-8d61-6e865e8fb74f-success-200-isvc-f6171-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:57:41.358913 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.358847 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/78bbb2de-6a30-45fd-ac95-13a9331a5f04-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:57:41.358913 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.358857 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6d7jj\" (UniqueName: \"kubernetes.io/projected/78bbb2de-6a30-45fd-ac95-13a9331a5f04-kube-api-access-6d7jj\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:57:41.358913 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.358866 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba06523a-200e-4f01-8d61-6e865e8fb74f-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:57:41.358913 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.358874 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-967vg\" (UniqueName: \"kubernetes.io/projected/ba06523a-200e-4f01-8d61-6e865e8fb74f-kube-api-access-967vg\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 16:57:41.362936 ip-10-0-128-44 kubenswrapper[2561]: I0424
16:57:41.362919 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj"] Apr 24 16:57:41.364508 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.364491 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6171-predictor-778d4b8f54-pj5fj"] Apr 24 16:57:41.365697 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.365678 2561 scope.go:117] "RemoveContainer" containerID="f8c6d5d282c477c33efdfcef4ba3370dadd8b219e03ae0543f481e4962110e0b" Apr 24 16:57:41.365927 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:57:41.365909 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8c6d5d282c477c33efdfcef4ba3370dadd8b219e03ae0543f481e4962110e0b\": container with ID starting with f8c6d5d282c477c33efdfcef4ba3370dadd8b219e03ae0543f481e4962110e0b not found: ID does not exist" containerID="f8c6d5d282c477c33efdfcef4ba3370dadd8b219e03ae0543f481e4962110e0b" Apr 24 16:57:41.365996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.365930 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8c6d5d282c477c33efdfcef4ba3370dadd8b219e03ae0543f481e4962110e0b"} err="failed to get container status \"f8c6d5d282c477c33efdfcef4ba3370dadd8b219e03ae0543f481e4962110e0b\": rpc error: code = NotFound desc = could not find container \"f8c6d5d282c477c33efdfcef4ba3370dadd8b219e03ae0543f481e4962110e0b\": container with ID starting with f8c6d5d282c477c33efdfcef4ba3370dadd8b219e03ae0543f481e4962110e0b not found: ID does not exist" Apr 24 16:57:41.365996 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.365945 2561 scope.go:117] "RemoveContainer" containerID="76beabac42033d96d9a1c57cc5e10feab7f0fbb40ba8e8aacdd6b7825f43b185" Apr 24 16:57:41.366240 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:57:41.366210 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"76beabac42033d96d9a1c57cc5e10feab7f0fbb40ba8e8aacdd6b7825f43b185\": container with ID starting with 76beabac42033d96d9a1c57cc5e10feab7f0fbb40ba8e8aacdd6b7825f43b185 not found: ID does not exist" containerID="76beabac42033d96d9a1c57cc5e10feab7f0fbb40ba8e8aacdd6b7825f43b185" Apr 24 16:57:41.366329 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.366247 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76beabac42033d96d9a1c57cc5e10feab7f0fbb40ba8e8aacdd6b7825f43b185"} err="failed to get container status \"76beabac42033d96d9a1c57cc5e10feab7f0fbb40ba8e8aacdd6b7825f43b185\": rpc error: code = NotFound desc = could not find container \"76beabac42033d96d9a1c57cc5e10feab7f0fbb40ba8e8aacdd6b7825f43b185\": container with ID starting with 76beabac42033d96d9a1c57cc5e10feab7f0fbb40ba8e8aacdd6b7825f43b185 not found: ID does not exist" Apr 24 16:57:41.375381 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.375362 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6"] Apr 24 16:57:41.379256 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.379236 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-f6171-predictor-669d57f96f-22hp6"] Apr 24 16:57:41.653577 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.653497 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" path="/var/lib/kubelet/pods/78bbb2de-6a30-45fd-ac95-13a9331a5f04/volumes" Apr 24 16:57:41.653899 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:41.653886 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" path="/var/lib/kubelet/pods/ba06523a-200e-4f01-8d61-6e865e8fb74f/volumes" Apr 24 16:57:42.211039 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:42.211004 2561 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" podUID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 24 16:57:45.334384 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:45.334353 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" Apr 24 16:57:45.334840 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:45.334818 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" Apr 24 16:57:45.334948 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:45.334856 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" podUID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 24 16:57:45.335220 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:45.335200 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 16:57:51.205896 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:51.205866 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:57:52.211768 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:52.211741 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:57:55.335747 ip-10-0-128-44 
kubenswrapper[2561]: I0424 16:57:55.335705 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 16:57:55.336146 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:57:55.335705 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" podUID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 24 16:58:05.334960 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:05.334918 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" podUID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 24 16:58:05.335402 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:05.335246 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 16:58:13.159796 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.159720 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv"] Apr 24 16:58:13.160251 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.159995 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" podUID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerName="kserve-container" 
containerID="cri-o://874fd37b87e8caa351339f9e0b8de351fa5b644d0bd964ab3b0045429abdda31" gracePeriod=30 Apr 24 16:58:13.160251 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.160034 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" podUID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerName="kube-rbac-proxy" containerID="cri-o://f31effb804f9b45fe75aeb367b7232c2b89d598ada719ce2d9d9d393467c095a" gracePeriod=30 Apr 24 16:58:13.229495 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.229466 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6"] Apr 24 16:58:13.229765 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.229728 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerName="kserve-container" containerID="cri-o://9c15c3b3be85b503c9d20aeec0208209af50bd6688421d3271c61f0e06ca0237" gracePeriod=30 Apr 24 16:58:13.229845 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.229768 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerName="kube-rbac-proxy" containerID="cri-o://c01cf7b663c22bd5bd8d25f2c863b0b454a64ad5f4ecc1a9f53ee995e9bb252b" gracePeriod=30 Apr 24 16:58:13.270950 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.270921 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx"] Apr 24 16:58:13.271435 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.271418 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerName="kserve-container" Apr 24 
16:58:13.271526 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.271436 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerName="kserve-container" Apr 24 16:58:13.271526 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.271459 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerName="kserve-container" Apr 24 16:58:13.271526 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.271467 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerName="kserve-container" Apr 24 16:58:13.271526 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.271482 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerName="kube-rbac-proxy" Apr 24 16:58:13.271526 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.271491 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerName="kube-rbac-proxy" Apr 24 16:58:13.271526 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.271508 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerName="kube-rbac-proxy" Apr 24 16:58:13.271526 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.271516 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerName="kube-rbac-proxy" Apr 24 16:58:13.271866 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.271583 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerName="kserve-container" Apr 24 16:58:13.271866 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.271594 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerName="kube-rbac-proxy" Apr 24 
16:58:13.271866 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.271606 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba06523a-200e-4f01-8d61-6e865e8fb74f" containerName="kserve-container" Apr 24 16:58:13.271866 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.271616 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="78bbb2de-6a30-45fd-ac95-13a9331a5f04" containerName="kube-rbac-proxy" Apr 24 16:58:13.274748 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.274727 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 16:58:13.277314 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.277292 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-77cb4-predictor-serving-cert\"" Apr 24 16:58:13.277411 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.277299 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-77cb4-kube-rbac-proxy-sar-config\"" Apr 24 16:58:13.285307 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.285288 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx"] Apr 24 16:58:13.397544 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.397513 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8"] Apr 24 16:58:13.404792 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.402382 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d51023aa-dae5-4673-8760-e8cf52d4cacb-proxy-tls\") pod \"success-200-isvc-77cb4-predictor-79db889498-sjlnx\" (UID: \"d51023aa-dae5-4673-8760-e8cf52d4cacb\") " 
pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 16:58:13.404792 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.402435 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-77cb4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d51023aa-dae5-4673-8760-e8cf52d4cacb-success-200-isvc-77cb4-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-77cb4-predictor-79db889498-sjlnx\" (UID: \"d51023aa-dae5-4673-8760-e8cf52d4cacb\") " pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 16:58:13.404792 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.402477 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq5jz\" (UniqueName: \"kubernetes.io/projected/d51023aa-dae5-4673-8760-e8cf52d4cacb-kube-api-access-qq5jz\") pod \"success-200-isvc-77cb4-predictor-79db889498-sjlnx\" (UID: \"d51023aa-dae5-4673-8760-e8cf52d4cacb\") " pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 16:58:13.406731 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.406710 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" Apr 24 16:58:13.409480 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.409462 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-77cb4-predictor-serving-cert\"" Apr 24 16:58:13.409571 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.409476 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-77cb4-kube-rbac-proxy-sar-config\"" Apr 24 16:58:13.425058 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.425022 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8"] Apr 24 16:58:13.443814 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.443788 2561 generic.go:358] "Generic (PLEG): container finished" podID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerID="f31effb804f9b45fe75aeb367b7232c2b89d598ada719ce2d9d9d393467c095a" exitCode=2 Apr 24 16:58:13.443919 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.443856 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" event={"ID":"92839f22-2bdc-4478-8460-1f9edbf864e9","Type":"ContainerDied","Data":"f31effb804f9b45fe75aeb367b7232c2b89d598ada719ce2d9d9d393467c095a"} Apr 24 16:58:13.445393 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.445372 2561 generic.go:358] "Generic (PLEG): container finished" podID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerID="c01cf7b663c22bd5bd8d25f2c863b0b454a64ad5f4ecc1a9f53ee995e9bb252b" exitCode=2 Apr 24 16:58:13.445478 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.445401 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" 
event={"ID":"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2","Type":"ContainerDied","Data":"c01cf7b663c22bd5bd8d25f2c863b0b454a64ad5f4ecc1a9f53ee995e9bb252b"} Apr 24 16:58:13.503628 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.503592 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d51023aa-dae5-4673-8760-e8cf52d4cacb-proxy-tls\") pod \"success-200-isvc-77cb4-predictor-79db889498-sjlnx\" (UID: \"d51023aa-dae5-4673-8760-e8cf52d4cacb\") " pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 16:58:13.503774 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.503635 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-77cb4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d51023aa-dae5-4673-8760-e8cf52d4cacb-success-200-isvc-77cb4-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-77cb4-predictor-79db889498-sjlnx\" (UID: \"d51023aa-dae5-4673-8760-e8cf52d4cacb\") " pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 16:58:13.503774 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.503667 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qq5jz\" (UniqueName: \"kubernetes.io/projected/d51023aa-dae5-4673-8760-e8cf52d4cacb-kube-api-access-qq5jz\") pod \"success-200-isvc-77cb4-predictor-79db889498-sjlnx\" (UID: \"d51023aa-dae5-4673-8760-e8cf52d4cacb\") " pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 16:58:13.503774 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.503702 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-77cb4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5735bf3c-3cfb-4792-825c-8d01882932ba-error-404-isvc-77cb4-kube-rbac-proxy-sar-config\") pod 
\"error-404-isvc-77cb4-predictor-6cd577bd49-pnql8\" (UID: \"5735bf3c-3cfb-4792-825c-8d01882932ba\") " pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" Apr 24 16:58:13.503913 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.503785 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sj5f\" (UniqueName: \"kubernetes.io/projected/5735bf3c-3cfb-4792-825c-8d01882932ba-kube-api-access-6sj5f\") pod \"error-404-isvc-77cb4-predictor-6cd577bd49-pnql8\" (UID: \"5735bf3c-3cfb-4792-825c-8d01882932ba\") " pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" Apr 24 16:58:13.503913 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.503851 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5735bf3c-3cfb-4792-825c-8d01882932ba-proxy-tls\") pod \"error-404-isvc-77cb4-predictor-6cd577bd49-pnql8\" (UID: \"5735bf3c-3cfb-4792-825c-8d01882932ba\") " pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" Apr 24 16:58:13.504373 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.504352 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-77cb4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d51023aa-dae5-4673-8760-e8cf52d4cacb-success-200-isvc-77cb4-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-77cb4-predictor-79db889498-sjlnx\" (UID: \"d51023aa-dae5-4673-8760-e8cf52d4cacb\") " pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 16:58:13.506030 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.506011 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d51023aa-dae5-4673-8760-e8cf52d4cacb-proxy-tls\") pod \"success-200-isvc-77cb4-predictor-79db889498-sjlnx\" (UID: 
\"d51023aa-dae5-4673-8760-e8cf52d4cacb\") " pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 16:58:13.513556 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.513534 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq5jz\" (UniqueName: \"kubernetes.io/projected/d51023aa-dae5-4673-8760-e8cf52d4cacb-kube-api-access-qq5jz\") pod \"success-200-isvc-77cb4-predictor-79db889498-sjlnx\" (UID: \"d51023aa-dae5-4673-8760-e8cf52d4cacb\") " pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 16:58:13.585767 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.585728 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 16:58:13.604286 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.604254 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6sj5f\" (UniqueName: \"kubernetes.io/projected/5735bf3c-3cfb-4792-825c-8d01882932ba-kube-api-access-6sj5f\") pod \"error-404-isvc-77cb4-predictor-6cd577bd49-pnql8\" (UID: \"5735bf3c-3cfb-4792-825c-8d01882932ba\") " pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" Apr 24 16:58:13.604395 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.604293 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5735bf3c-3cfb-4792-825c-8d01882932ba-proxy-tls\") pod \"error-404-isvc-77cb4-predictor-6cd577bd49-pnql8\" (UID: \"5735bf3c-3cfb-4792-825c-8d01882932ba\") " pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" Apr 24 16:58:13.604395 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.604335 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-77cb4-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/5735bf3c-3cfb-4792-825c-8d01882932ba-error-404-isvc-77cb4-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-77cb4-predictor-6cd577bd49-pnql8\" (UID: \"5735bf3c-3cfb-4792-825c-8d01882932ba\") " pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" Apr 24 16:58:13.604889 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.604866 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-77cb4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5735bf3c-3cfb-4792-825c-8d01882932ba-error-404-isvc-77cb4-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-77cb4-predictor-6cd577bd49-pnql8\" (UID: \"5735bf3c-3cfb-4792-825c-8d01882932ba\") " pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" Apr 24 16:58:13.606525 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.606506 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5735bf3c-3cfb-4792-825c-8d01882932ba-proxy-tls\") pod \"error-404-isvc-77cb4-predictor-6cd577bd49-pnql8\" (UID: \"5735bf3c-3cfb-4792-825c-8d01882932ba\") " pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" Apr 24 16:58:13.612552 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.612533 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sj5f\" (UniqueName: \"kubernetes.io/projected/5735bf3c-3cfb-4792-825c-8d01882932ba-kube-api-access-6sj5f\") pod \"error-404-isvc-77cb4-predictor-6cd577bd49-pnql8\" (UID: \"5735bf3c-3cfb-4792-825c-8d01882932ba\") " pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" Apr 24 16:58:13.707519 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.707499 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx"] Apr 24 16:58:13.709500 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:58:13.709472 2561 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd51023aa_dae5_4673_8760_e8cf52d4cacb.slice/crio-a30ee6a09a0a9337ab490a2a6a4af4f1b0928db08f880cfd71e0c6f571326875 WatchSource:0}: Error finding container a30ee6a09a0a9337ab490a2a6a4af4f1b0928db08f880cfd71e0c6f571326875: Status 404 returned error can't find the container with id a30ee6a09a0a9337ab490a2a6a4af4f1b0928db08f880cfd71e0c6f571326875 Apr 24 16:58:13.716498 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.716477 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" Apr 24 16:58:13.847027 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:13.847005 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8"] Apr 24 16:58:13.848934 ip-10-0-128-44 kubenswrapper[2561]: W0424 16:58:13.848912 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5735bf3c_3cfb_4792_825c_8d01882932ba.slice/crio-e8c557e5519bafd045a255a4a1635641d6afb68b399d34fe660175a1bb96e000 WatchSource:0}: Error finding container e8c557e5519bafd045a255a4a1635641d6afb68b399d34fe660175a1bb96e000: Status 404 returned error can't find the container with id e8c557e5519bafd045a255a4a1635641d6afb68b399d34fe660175a1bb96e000 Apr 24 16:58:14.450184 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:14.450148 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" event={"ID":"5735bf3c-3cfb-4792-825c-8d01882932ba","Type":"ContainerStarted","Data":"f9c6e078c7dcb35d6388b3d45c187128a6b0e4bdd2cee61ef40a3204643caf5f"} Apr 24 16:58:14.450184 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:14.450189 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" event={"ID":"5735bf3c-3cfb-4792-825c-8d01882932ba","Type":"ContainerStarted","Data":"96d5c358eecc6f0e6a34cbb114324e8cac0cf6c55601e39399bb2631180e54ef"} Apr 24 16:58:14.450663 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:14.450354 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" Apr 24 16:58:14.450663 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:14.450383 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" Apr 24 16:58:14.450663 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:14.450396 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" event={"ID":"5735bf3c-3cfb-4792-825c-8d01882932ba","Type":"ContainerStarted","Data":"e8c557e5519bafd045a255a4a1635641d6afb68b399d34fe660175a1bb96e000"} Apr 24 16:58:14.451536 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:14.451509 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" event={"ID":"d51023aa-dae5-4673-8760-e8cf52d4cacb","Type":"ContainerStarted","Data":"f373a0fd36651c7be0bbb0a10baaefe057baf3103a44052f5db32f24f25873e9"} Apr 24 16:58:14.451536 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:14.451536 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" event={"ID":"d51023aa-dae5-4673-8760-e8cf52d4cacb","Type":"ContainerStarted","Data":"04eec3fc5902bc8416124380d2fbe400b4b9b731ec3a77f7effef7925ca6b8f6"} Apr 24 16:58:14.451685 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:14.451547 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" 
event={"ID":"d51023aa-dae5-4673-8760-e8cf52d4cacb","Type":"ContainerStarted","Data":"a30ee6a09a0a9337ab490a2a6a4af4f1b0928db08f880cfd71e0c6f571326875"} Apr 24 16:58:14.451724 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:14.451688 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 16:58:14.451724 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:14.451685 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 16:58:14.529812 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:14.529762 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" podStartSLOduration=1.5297490150000002 podStartE2EDuration="1.529749015s" podCreationTimestamp="2026-04-24 16:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:58:14.528382464 +0000 UTC m=+1163.381253176" watchObservedRunningTime="2026-04-24 16:58:14.529749015 +0000 UTC m=+1163.382619727" Apr 24 16:58:14.530935 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:14.530907 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" podStartSLOduration=1.530896694 podStartE2EDuration="1.530896694s" podCreationTimestamp="2026-04-24 16:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:58:14.498558966 +0000 UTC m=+1163.351429677" watchObservedRunningTime="2026-04-24 16:58:14.530896694 +0000 UTC 
m=+1163.383767403" Apr 24 16:58:15.335030 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:15.334990 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" podUID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 24 16:58:15.335363 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:15.335333 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 24 16:58:15.454898 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:15.454859 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 16:58:15.454898 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:15.454889 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 16:58:15.456075 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:15.456052 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" podUID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 16:58:16.200589 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.200559 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" 
podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused" Apr 24 16:58:16.406315 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.406289 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:58:16.459242 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.459216 2561 generic.go:358] "Generic (PLEG): container finished" podID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerID="874fd37b87e8caa351339f9e0b8de351fa5b644d0bd964ab3b0045429abdda31" exitCode=0 Apr 24 16:58:16.459585 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.459259 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" event={"ID":"92839f22-2bdc-4478-8460-1f9edbf864e9","Type":"ContainerDied","Data":"874fd37b87e8caa351339f9e0b8de351fa5b644d0bd964ab3b0045429abdda31"} Apr 24 16:58:16.459585 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.459280 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" Apr 24 16:58:16.459585 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.459296 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv" event={"ID":"92839f22-2bdc-4478-8460-1f9edbf864e9","Type":"ContainerDied","Data":"21a2118454f3c1ab47630e69522d27ac354cf60c631a4343b25cc40001122851"} Apr 24 16:58:16.459585 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.459313 2561 scope.go:117] "RemoveContainer" containerID="f31effb804f9b45fe75aeb367b7232c2b89d598ada719ce2d9d9d393467c095a" Apr 24 16:58:16.459777 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.459756 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" podUID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 16:58:16.466807 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.466788 2561 scope.go:117] "RemoveContainer" containerID="874fd37b87e8caa351339f9e0b8de351fa5b644d0bd964ab3b0045429abdda31" Apr 24 16:58:16.475803 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.475773 2561 scope.go:117] "RemoveContainer" containerID="f31effb804f9b45fe75aeb367b7232c2b89d598ada719ce2d9d9d393467c095a" Apr 24 16:58:16.476224 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:58:16.476197 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f31effb804f9b45fe75aeb367b7232c2b89d598ada719ce2d9d9d393467c095a\": container with ID starting with f31effb804f9b45fe75aeb367b7232c2b89d598ada719ce2d9d9d393467c095a not found: ID does not exist" containerID="f31effb804f9b45fe75aeb367b7232c2b89d598ada719ce2d9d9d393467c095a" Apr 24 16:58:16.476332 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.476234 2561 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f31effb804f9b45fe75aeb367b7232c2b89d598ada719ce2d9d9d393467c095a"} err="failed to get container status \"f31effb804f9b45fe75aeb367b7232c2b89d598ada719ce2d9d9d393467c095a\": rpc error: code = NotFound desc = could not find container \"f31effb804f9b45fe75aeb367b7232c2b89d598ada719ce2d9d9d393467c095a\": container with ID starting with f31effb804f9b45fe75aeb367b7232c2b89d598ada719ce2d9d9d393467c095a not found: ID does not exist" Apr 24 16:58:16.476332 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.476254 2561 scope.go:117] "RemoveContainer" containerID="874fd37b87e8caa351339f9e0b8de351fa5b644d0bd964ab3b0045429abdda31" Apr 24 16:58:16.476516 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:58:16.476500 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"874fd37b87e8caa351339f9e0b8de351fa5b644d0bd964ab3b0045429abdda31\": container with ID starting with 874fd37b87e8caa351339f9e0b8de351fa5b644d0bd964ab3b0045429abdda31 not found: ID does not exist" containerID="874fd37b87e8caa351339f9e0b8de351fa5b644d0bd964ab3b0045429abdda31" Apr 24 16:58:16.476580 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.476520 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"874fd37b87e8caa351339f9e0b8de351fa5b644d0bd964ab3b0045429abdda31"} err="failed to get container status \"874fd37b87e8caa351339f9e0b8de351fa5b644d0bd964ab3b0045429abdda31\": rpc error: code = NotFound desc = could not find container \"874fd37b87e8caa351339f9e0b8de351fa5b644d0bd964ab3b0045429abdda31\": container with ID starting with 874fd37b87e8caa351339f9e0b8de351fa5b644d0bd964ab3b0045429abdda31 not found: ID does not exist" Apr 24 16:58:16.529964 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.529941 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6cgsk\" (UniqueName: \"kubernetes.io/projected/92839f22-2bdc-4478-8460-1f9edbf864e9-kube-api-access-6cgsk\") pod \"92839f22-2bdc-4478-8460-1f9edbf864e9\" (UID: \"92839f22-2bdc-4478-8460-1f9edbf864e9\") " Apr 24 16:58:16.530076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.529982 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92839f22-2bdc-4478-8460-1f9edbf864e9-proxy-tls\") pod \"92839f22-2bdc-4478-8460-1f9edbf864e9\" (UID: \"92839f22-2bdc-4478-8460-1f9edbf864e9\") " Apr 24 16:58:16.530076 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.530037 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-3eac5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/92839f22-2bdc-4478-8460-1f9edbf864e9-error-404-isvc-3eac5-kube-rbac-proxy-sar-config\") pod \"92839f22-2bdc-4478-8460-1f9edbf864e9\" (UID: \"92839f22-2bdc-4478-8460-1f9edbf864e9\") " Apr 24 16:58:16.530411 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.530387 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92839f22-2bdc-4478-8460-1f9edbf864e9-error-404-isvc-3eac5-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-3eac5-kube-rbac-proxy-sar-config") pod "92839f22-2bdc-4478-8460-1f9edbf864e9" (UID: "92839f22-2bdc-4478-8460-1f9edbf864e9"). InnerVolumeSpecName "error-404-isvc-3eac5-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:58:16.531981 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.531953 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92839f22-2bdc-4478-8460-1f9edbf864e9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "92839f22-2bdc-4478-8460-1f9edbf864e9" (UID: "92839f22-2bdc-4478-8460-1f9edbf864e9"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:58:16.532051 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.532025 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92839f22-2bdc-4478-8460-1f9edbf864e9-kube-api-access-6cgsk" (OuterVolumeSpecName: "kube-api-access-6cgsk") pod "92839f22-2bdc-4478-8460-1f9edbf864e9" (UID: "92839f22-2bdc-4478-8460-1f9edbf864e9"). InnerVolumeSpecName "kube-api-access-6cgsk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:58:16.631165 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.631133 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6cgsk\" (UniqueName: \"kubernetes.io/projected/92839f22-2bdc-4478-8460-1f9edbf864e9-kube-api-access-6cgsk\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:58:16.631165 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.631165 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92839f22-2bdc-4478-8460-1f9edbf864e9-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:58:16.631317 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.631183 2561 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-3eac5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/92839f22-2bdc-4478-8460-1f9edbf864e9-error-404-isvc-3eac5-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:58:16.785616 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.785589 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv"] Apr 24 16:58:16.792205 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:16.792178 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3eac5-predictor-5466454db9-flmsv"] Apr 24 16:58:16.959595 ip-10-0-128-44 
kubenswrapper[2561]: I0424 16:58:16.959574 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:58:17.137583 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.137507 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-3eac5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-success-200-isvc-3eac5-kube-rbac-proxy-sar-config\") pod \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\" (UID: \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\") " Apr 24 16:58:17.137716 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.137595 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-proxy-tls\") pod \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\" (UID: \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\") " Apr 24 16:58:17.137716 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.137637 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldvz8\" (UniqueName: \"kubernetes.io/projected/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-kube-api-access-ldvz8\") pod \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\" (UID: \"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2\") " Apr 24 16:58:17.137846 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.137823 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-success-200-isvc-3eac5-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-3eac5-kube-rbac-proxy-sar-config") pod "9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" (UID: "9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2"). InnerVolumeSpecName "success-200-isvc-3eac5-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:58:17.139614 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.139595 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-kube-api-access-ldvz8" (OuterVolumeSpecName: "kube-api-access-ldvz8") pod "9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" (UID: "9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2"). InnerVolumeSpecName "kube-api-access-ldvz8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:58:17.139661 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.139627 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" (UID: "9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:58:17.238317 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.238290 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:58:17.238317 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.238316 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ldvz8\" (UniqueName: \"kubernetes.io/projected/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-kube-api-access-ldvz8\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:58:17.238489 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.238328 2561 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-3eac5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2-success-200-isvc-3eac5-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 16:58:17.465063 
ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.465032 2561 generic.go:358] "Generic (PLEG): container finished" podID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerID="9c15c3b3be85b503c9d20aeec0208209af50bd6688421d3271c61f0e06ca0237" exitCode=0 Apr 24 16:58:17.465585 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.465131 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" event={"ID":"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2","Type":"ContainerDied","Data":"9c15c3b3be85b503c9d20aeec0208209af50bd6688421d3271c61f0e06ca0237"} Apr 24 16:58:17.465585 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.465171 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" Apr 24 16:58:17.465585 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.465184 2561 scope.go:117] "RemoveContainer" containerID="c01cf7b663c22bd5bd8d25f2c863b0b454a64ad5f4ecc1a9f53ee995e9bb252b" Apr 24 16:58:17.465585 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.465174 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6" event={"ID":"9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2","Type":"ContainerDied","Data":"b513609b2db068efc79e971712dae468f7103dec6ecaaf14236c9a71e88b4cbb"} Apr 24 16:58:17.476267 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.476251 2561 scope.go:117] "RemoveContainer" containerID="9c15c3b3be85b503c9d20aeec0208209af50bd6688421d3271c61f0e06ca0237" Apr 24 16:58:17.483238 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.483217 2561 scope.go:117] "RemoveContainer" containerID="c01cf7b663c22bd5bd8d25f2c863b0b454a64ad5f4ecc1a9f53ee995e9bb252b" Apr 24 16:58:17.483465 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:58:17.483444 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"c01cf7b663c22bd5bd8d25f2c863b0b454a64ad5f4ecc1a9f53ee995e9bb252b\": container with ID starting with c01cf7b663c22bd5bd8d25f2c863b0b454a64ad5f4ecc1a9f53ee995e9bb252b not found: ID does not exist" containerID="c01cf7b663c22bd5bd8d25f2c863b0b454a64ad5f4ecc1a9f53ee995e9bb252b" Apr 24 16:58:17.483526 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.483476 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01cf7b663c22bd5bd8d25f2c863b0b454a64ad5f4ecc1a9f53ee995e9bb252b"} err="failed to get container status \"c01cf7b663c22bd5bd8d25f2c863b0b454a64ad5f4ecc1a9f53ee995e9bb252b\": rpc error: code = NotFound desc = could not find container \"c01cf7b663c22bd5bd8d25f2c863b0b454a64ad5f4ecc1a9f53ee995e9bb252b\": container with ID starting with c01cf7b663c22bd5bd8d25f2c863b0b454a64ad5f4ecc1a9f53ee995e9bb252b not found: ID does not exist" Apr 24 16:58:17.483526 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.483500 2561 scope.go:117] "RemoveContainer" containerID="9c15c3b3be85b503c9d20aeec0208209af50bd6688421d3271c61f0e06ca0237" Apr 24 16:58:17.483712 ip-10-0-128-44 kubenswrapper[2561]: E0424 16:58:17.483695 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c15c3b3be85b503c9d20aeec0208209af50bd6688421d3271c61f0e06ca0237\": container with ID starting with 9c15c3b3be85b503c9d20aeec0208209af50bd6688421d3271c61f0e06ca0237 not found: ID does not exist" containerID="9c15c3b3be85b503c9d20aeec0208209af50bd6688421d3271c61f0e06ca0237" Apr 24 16:58:17.483750 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.483717 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c15c3b3be85b503c9d20aeec0208209af50bd6688421d3271c61f0e06ca0237"} err="failed to get container status \"9c15c3b3be85b503c9d20aeec0208209af50bd6688421d3271c61f0e06ca0237\": rpc error: code = NotFound desc = could not find container 
\"9c15c3b3be85b503c9d20aeec0208209af50bd6688421d3271c61f0e06ca0237\": container with ID starting with 9c15c3b3be85b503c9d20aeec0208209af50bd6688421d3271c61f0e06ca0237 not found: ID does not exist" Apr 24 16:58:17.488228 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.488210 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6"] Apr 24 16:58:17.493799 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.493782 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3eac5-predictor-5dcfddbf4b-456v6"] Apr 24 16:58:17.651325 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.651299 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92839f22-2bdc-4478-8460-1f9edbf864e9" path="/var/lib/kubelet/pods/92839f22-2bdc-4478-8460-1f9edbf864e9/volumes" Apr 24 16:58:17.651683 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:17.651671 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" path="/var/lib/kubelet/pods/9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2/volumes" Apr 24 16:58:20.459130 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:20.459085 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" Apr 24 16:58:20.459645 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:20.459618 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 16:58:21.464633 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:21.464603 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 
16:58:21.465152 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:21.465108 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" podUID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 16:58:25.335392 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:25.335365 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" Apr 24 16:58:25.336066 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:25.336049 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" Apr 24 16:58:30.460031 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:30.459989 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 16:58:31.465496 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:31.465460 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" podUID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 16:58:40.459682 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:40.459642 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 16:58:41.465954 ip-10-0-128-44 
kubenswrapper[2561]: I0424 16:58:41.465918 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" podUID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 16:58:50.459961 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:50.459925 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 16:58:51.465888 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:58:51.465853 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" podUID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 24 16:59:00.460310 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:59:00.460274 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" Apr 24 16:59:01.466304 ip-10-0-128-44 kubenswrapper[2561]: I0424 16:59:01.466264 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 17:06:52.179096 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.179062 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"] Apr 24 17:06:52.179668 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.179370 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" 
podUID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerName="kserve-container" containerID="cri-o://935c98947ea71251ed14e0039b6841dbdb09e6259184d5b247182ce778f8f969" gracePeriod=30 Apr 24 17:06:52.179668 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.179405 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" podUID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerName="kube-rbac-proxy" containerID="cri-o://c8f9f94b74e365867f95e1307e2857adc2be13bba8878b4c1fafeeb68a732709" gracePeriod=30 Apr 24 17:06:52.247154 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.247100 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"] Apr 24 17:06:52.247467 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.247433 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kserve-container" containerID="cri-o://21e1989363af2ca3e2b86ce0cf017d0d3c38c4551ed4a5c6228eb7a70db4acd7" gracePeriod=30 Apr 24 17:06:52.247579 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.247491 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kube-rbac-proxy" containerID="cri-o://5c755b61f0fecd0a7ee6d363d3f19b0d92ccf06ed5ff26e915397bda3f110e02" gracePeriod=30 Apr 24 17:06:52.256072 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.256050 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"] Apr 24 17:06:52.256395 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.256383 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerName="kube-rbac-proxy"
Apr 24 17:06:52.256452 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.256397 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerName="kube-rbac-proxy"
Apr 24 17:06:52.256452 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.256406 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerName="kserve-container"
Apr 24 17:06:52.256452 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.256412 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerName="kserve-container"
Apr 24 17:06:52.256452 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.256427 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerName="kube-rbac-proxy"
Apr 24 17:06:52.256452 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.256434 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerName="kube-rbac-proxy"
Apr 24 17:06:52.256452 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.256450 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerName="kserve-container"
Apr 24 17:06:52.256736 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.256458 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerName="kserve-container"
Apr 24 17:06:52.256736 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.256526 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerName="kube-rbac-proxy"
Apr 24 17:06:52.256736 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.256539 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerName="kube-rbac-proxy"
Apr 24 17:06:52.256736 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.256549 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="92839f22-2bdc-4478-8460-1f9edbf864e9" containerName="kserve-container"
Apr 24 17:06:52.256736 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.256560 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e8d6087-9ff2-4eb9-b90f-ae8f84bd9cc2" containerName="kserve-container"
Apr 24 17:06:52.259640 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.259623 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"
Apr 24 17:06:52.263155 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.263137 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-3827a-kube-rbac-proxy-sar-config\""
Apr 24 17:06:52.263329 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.263312 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-3827a-predictor-serving-cert\""
Apr 24 17:06:52.280457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.280435 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"]
Apr 24 17:06:52.321754 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.321721 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"]
Apr 24 17:06:52.325519 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.325500 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"
Apr 24 17:06:52.327934 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.327908 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-3827a-kube-rbac-proxy-sar-config\""
Apr 24 17:06:52.328034 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.327935 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-3827a-predictor-serving-cert\""
Apr 24 17:06:52.336563 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.336541 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"]
Apr 24 17:06:52.386904 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.386882 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl4x4\" (UniqueName: \"kubernetes.io/projected/ded5d261-7d83-4e14-bcb0-56c408e51992-kube-api-access-tl4x4\") pod \"error-404-isvc-3827a-predictor-69b76c5cb9-hq259\" (UID: \"ded5d261-7d83-4e14-bcb0-56c408e51992\") " pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"
Apr 24 17:06:52.387003 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.386937 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae47a928-b7d6-4a05-9a47-2cf95df423ca-proxy-tls\") pod \"success-200-isvc-3827a-predictor-7f98b8946b-25n6x\" (UID: \"ae47a928-b7d6-4a05-9a47-2cf95df423ca\") " pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"
Apr 24 17:06:52.387003 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.386956 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-3827a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ae47a928-b7d6-4a05-9a47-2cf95df423ca-success-200-isvc-3827a-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-3827a-predictor-7f98b8946b-25n6x\" (UID: \"ae47a928-b7d6-4a05-9a47-2cf95df423ca\") " pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"
Apr 24 17:06:52.387003 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.386974 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5wfw\" (UniqueName: \"kubernetes.io/projected/ae47a928-b7d6-4a05-9a47-2cf95df423ca-kube-api-access-f5wfw\") pod \"success-200-isvc-3827a-predictor-7f98b8946b-25n6x\" (UID: \"ae47a928-b7d6-4a05-9a47-2cf95df423ca\") " pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"
Apr 24 17:06:52.387130 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.387063 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ded5d261-7d83-4e14-bcb0-56c408e51992-proxy-tls\") pod \"error-404-isvc-3827a-predictor-69b76c5cb9-hq259\" (UID: \"ded5d261-7d83-4e14-bcb0-56c408e51992\") " pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"
Apr 24 17:06:52.387184 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.387127 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-3827a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ded5d261-7d83-4e14-bcb0-56c408e51992-error-404-isvc-3827a-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-3827a-predictor-69b76c5cb9-hq259\" (UID: \"ded5d261-7d83-4e14-bcb0-56c408e51992\") " pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"
Apr 24 17:06:52.488184 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.488147 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae47a928-b7d6-4a05-9a47-2cf95df423ca-proxy-tls\") pod \"success-200-isvc-3827a-predictor-7f98b8946b-25n6x\" (UID: \"ae47a928-b7d6-4a05-9a47-2cf95df423ca\") " pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"
Apr 24 17:06:52.488184 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.488186 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-3827a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ae47a928-b7d6-4a05-9a47-2cf95df423ca-success-200-isvc-3827a-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-3827a-predictor-7f98b8946b-25n6x\" (UID: \"ae47a928-b7d6-4a05-9a47-2cf95df423ca\") " pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"
Apr 24 17:06:52.488412 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.488206 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5wfw\" (UniqueName: \"kubernetes.io/projected/ae47a928-b7d6-4a05-9a47-2cf95df423ca-kube-api-access-f5wfw\") pod \"success-200-isvc-3827a-predictor-7f98b8946b-25n6x\" (UID: \"ae47a928-b7d6-4a05-9a47-2cf95df423ca\") " pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"
Apr 24 17:06:52.488412 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.488232 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ded5d261-7d83-4e14-bcb0-56c408e51992-proxy-tls\") pod \"error-404-isvc-3827a-predictor-69b76c5cb9-hq259\" (UID: \"ded5d261-7d83-4e14-bcb0-56c408e51992\") " pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"
Apr 24 17:06:52.488412 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.488260 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-3827a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ded5d261-7d83-4e14-bcb0-56c408e51992-error-404-isvc-3827a-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-3827a-predictor-69b76c5cb9-hq259\" (UID: \"ded5d261-7d83-4e14-bcb0-56c408e51992\") " pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"
Apr 24 17:06:52.488412 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.488295 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tl4x4\" (UniqueName: \"kubernetes.io/projected/ded5d261-7d83-4e14-bcb0-56c408e51992-kube-api-access-tl4x4\") pod \"error-404-isvc-3827a-predictor-69b76c5cb9-hq259\" (UID: \"ded5d261-7d83-4e14-bcb0-56c408e51992\") " pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"
Apr 24 17:06:52.488879 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.488853 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-3827a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ae47a928-b7d6-4a05-9a47-2cf95df423ca-success-200-isvc-3827a-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-3827a-predictor-7f98b8946b-25n6x\" (UID: \"ae47a928-b7d6-4a05-9a47-2cf95df423ca\") " pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"
Apr 24 17:06:52.489014 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.488981 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-3827a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ded5d261-7d83-4e14-bcb0-56c408e51992-error-404-isvc-3827a-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-3827a-predictor-69b76c5cb9-hq259\" (UID: \"ded5d261-7d83-4e14-bcb0-56c408e51992\") " pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"
Apr 24 17:06:52.490473 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.490453 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae47a928-b7d6-4a05-9a47-2cf95df423ca-proxy-tls\") pod \"success-200-isvc-3827a-predictor-7f98b8946b-25n6x\" (UID: \"ae47a928-b7d6-4a05-9a47-2cf95df423ca\") " pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"
Apr 24 17:06:52.490594 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.490578 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ded5d261-7d83-4e14-bcb0-56c408e51992-proxy-tls\") pod \"error-404-isvc-3827a-predictor-69b76c5cb9-hq259\" (UID: \"ded5d261-7d83-4e14-bcb0-56c408e51992\") " pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"
Apr 24 17:06:52.496073 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.496047 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl4x4\" (UniqueName: \"kubernetes.io/projected/ded5d261-7d83-4e14-bcb0-56c408e51992-kube-api-access-tl4x4\") pod \"error-404-isvc-3827a-predictor-69b76c5cb9-hq259\" (UID: \"ded5d261-7d83-4e14-bcb0-56c408e51992\") " pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"
Apr 24 17:06:52.496171 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.496085 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5wfw\" (UniqueName: \"kubernetes.io/projected/ae47a928-b7d6-4a05-9a47-2cf95df423ca-kube-api-access-f5wfw\") pod \"success-200-isvc-3827a-predictor-7f98b8946b-25n6x\" (UID: \"ae47a928-b7d6-4a05-9a47-2cf95df423ca\") " pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"
Apr 24 17:06:52.571775 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.571750 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"
Apr 24 17:06:52.637381 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.637349 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"
Apr 24 17:06:52.688661 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.688625 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"]
Apr 24 17:06:52.692086 ip-10-0-128-44 kubenswrapper[2561]: W0424 17:06:52.692052 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae47a928_b7d6_4a05_9a47_2cf95df423ca.slice/crio-bae4aa7f8c0234723c42b954c446306eef2154a0415ac56ce623830664cf31a0 WatchSource:0}: Error finding container bae4aa7f8c0234723c42b954c446306eef2154a0415ac56ce623830664cf31a0: Status 404 returned error can't find the container with id bae4aa7f8c0234723c42b954c446306eef2154a0415ac56ce623830664cf31a0
Apr 24 17:06:52.694251 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.694232 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 17:06:52.766735 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:52.766707 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"]
Apr 24 17:06:52.769522 ip-10-0-128-44 kubenswrapper[2561]: W0424 17:06:52.769483 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podded5d261_7d83_4e14_bcb0_56c408e51992.slice/crio-fe759162975bdcc041e9d77ce311ec154362f130fae7728f4d195ec4c0c5849a WatchSource:0}: Error finding container fe759162975bdcc041e9d77ce311ec154362f130fae7728f4d195ec4c0c5849a: Status 404 returned error can't find the container with id fe759162975bdcc041e9d77ce311ec154362f130fae7728f4d195ec4c0c5849a
Apr 24 17:06:53.041465 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.041371 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" event={"ID":"ded5d261-7d83-4e14-bcb0-56c408e51992","Type":"ContainerStarted","Data":"896c9fc48e23aa1fb9ad9d9b7cbdc39435f546dccb885446191c88ff8e11f2fe"}
Apr 24 17:06:53.041465 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.041423 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" event={"ID":"ded5d261-7d83-4e14-bcb0-56c408e51992","Type":"ContainerStarted","Data":"b8953e6ce84cea8fd0f8522647e83467ba2b4667f1ab7217d95fee1af4a13694"}
Apr 24 17:06:53.041465 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.041437 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" event={"ID":"ded5d261-7d83-4e14-bcb0-56c408e51992","Type":"ContainerStarted","Data":"fe759162975bdcc041e9d77ce311ec154362f130fae7728f4d195ec4c0c5849a"}
Apr 24 17:06:53.041727 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.041534 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"
Apr 24 17:06:53.042882 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.042858 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" event={"ID":"ae47a928-b7d6-4a05-9a47-2cf95df423ca","Type":"ContainerStarted","Data":"6727c47cd0d69d8c0e967dffbe02f602b3280fef91e276ead6454bc54156fdcf"}
Apr 24 17:06:53.042882 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.042884 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" event={"ID":"ae47a928-b7d6-4a05-9a47-2cf95df423ca","Type":"ContainerStarted","Data":"f0f997b2a62569b9b502cff352c3f05e99eccbb4c557c12543be5a6533eb0c0e"}
Apr 24 17:06:53.043080 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.042894 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" event={"ID":"ae47a928-b7d6-4a05-9a47-2cf95df423ca","Type":"ContainerStarted","Data":"bae4aa7f8c0234723c42b954c446306eef2154a0415ac56ce623830664cf31a0"}
Apr 24 17:06:53.043162 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.043085 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"
Apr 24 17:06:53.043162 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.043102 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"
Apr 24 17:06:53.044404 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.044373 2561 generic.go:358] "Generic (PLEG): container finished" podID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerID="c8f9f94b74e365867f95e1307e2857adc2be13bba8878b4c1fafeeb68a732709" exitCode=2
Apr 24 17:06:53.044526 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.044440 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" event={"ID":"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9","Type":"ContainerDied","Data":"c8f9f94b74e365867f95e1307e2857adc2be13bba8878b4c1fafeeb68a732709"}
Apr 24 17:06:53.044526 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.044503 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 24 17:06:53.045901 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.045877 2561 generic.go:358] "Generic (PLEG): container finished" podID="28719115-6329-46e4-8521-9fbf569b1aca" containerID="5c755b61f0fecd0a7ee6d363d3f19b0d92ccf06ed5ff26e915397bda3f110e02" exitCode=2
Apr 24 17:06:53.046033 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.045910 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" event={"ID":"28719115-6329-46e4-8521-9fbf569b1aca","Type":"ContainerDied","Data":"5c755b61f0fecd0a7ee6d363d3f19b0d92ccf06ed5ff26e915397bda3f110e02"}
Apr 24 17:06:53.061261 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.061225 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" podStartSLOduration=1.061213202 podStartE2EDuration="1.061213202s" podCreationTimestamp="2026-04-24 17:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:06:53.05927318 +0000 UTC m=+1681.912143891" watchObservedRunningTime="2026-04-24 17:06:53.061213202 +0000 UTC m=+1681.914083915"
Apr 24 17:06:53.076699 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:53.076646 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" podStartSLOduration=1.076628407 podStartE2EDuration="1.076628407s" podCreationTimestamp="2026-04-24 17:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:06:53.075049161 +0000 UTC m=+1681.927919873" watchObservedRunningTime="2026-04-24 17:06:53.076628407 +0000 UTC m=+1681.929499123"
Apr 24 17:06:54.049154 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:54.049126 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"
Apr 24 17:06:54.049154 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:54.049107 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 24 17:06:54.050258 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:54.050228 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 24 17:06:55.055591 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:55.055553 2561 generic.go:358] "Generic (PLEG): container finished" podID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerID="935c98947ea71251ed14e0039b6841dbdb09e6259184d5b247182ce778f8f969" exitCode=0
Apr 24 17:06:55.055891 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:55.055619 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" event={"ID":"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9","Type":"ContainerDied","Data":"935c98947ea71251ed14e0039b6841dbdb09e6259184d5b247182ce778f8f969"}
Apr 24 17:06:55.055965 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:55.055946 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 24 17:06:55.116434 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:55.116409 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"
Apr 24 17:06:55.209028 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:55.208995 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42vqb\" (UniqueName: \"kubernetes.io/projected/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-kube-api-access-42vqb\") pod \"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9\" (UID: \"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9\") "
Apr 24 17:06:55.209218 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:55.209033 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-510ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-success-200-isvc-510ff-kube-rbac-proxy-sar-config\") pod \"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9\" (UID: \"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9\") "
Apr 24 17:06:55.209218 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:55.209066 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-proxy-tls\") pod \"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9\" (UID: \"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9\") "
Apr 24 17:06:55.209414 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:55.209391 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-success-200-isvc-510ff-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-510ff-kube-rbac-proxy-sar-config") pod "2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" (UID: "2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9"). InnerVolumeSpecName "success-200-isvc-510ff-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 17:06:55.211066 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:55.211042 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-kube-api-access-42vqb" (OuterVolumeSpecName: "kube-api-access-42vqb") pod "2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" (UID: "2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9"). InnerVolumeSpecName "kube-api-access-42vqb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 17:06:55.211153 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:55.211046 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" (UID: "2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 17:06:55.309815 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:55.309743 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-42vqb\" (UniqueName: \"kubernetes.io/projected/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-kube-api-access-42vqb\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 17:06:55.309815 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:55.309768 2561 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-510ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-success-200-isvc-510ff-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 17:06:55.309815 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:55.309780 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 17:06:55.330946 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:55.330921 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.40:8643/healthz\": dial tcp 10.133.0.40:8643: connect: connection refused"
Apr 24 17:06:55.336108 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:55.336086 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused"
Apr 24 17:06:56.061193 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.061157 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb" event={"ID":"2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9","Type":"ContainerDied","Data":"319fb09dd1b0fe5ade154118a5548568734290f5c39209b85ba421e17e4a5eae"}
Apr 24 17:06:56.061588 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.061211 2561 scope.go:117] "RemoveContainer" containerID="c8f9f94b74e365867f95e1307e2857adc2be13bba8878b4c1fafeeb68a732709"
Apr 24 17:06:56.061588 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.061175 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"
Apr 24 17:06:56.062985 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.062945 2561 generic.go:358] "Generic (PLEG): container finished" podID="28719115-6329-46e4-8521-9fbf569b1aca" containerID="21e1989363af2ca3e2b86ce0cf017d0d3c38c4551ed4a5c6228eb7a70db4acd7" exitCode=0
Apr 24 17:06:56.062985 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.062971 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" event={"ID":"28719115-6329-46e4-8521-9fbf569b1aca","Type":"ContainerDied","Data":"21e1989363af2ca3e2b86ce0cf017d0d3c38c4551ed4a5c6228eb7a70db4acd7"}
Apr 24 17:06:56.069304 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.069288 2561 scope.go:117] "RemoveContainer" containerID="935c98947ea71251ed14e0039b6841dbdb09e6259184d5b247182ce778f8f969"
Apr 24 17:06:56.080603 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.080582 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"]
Apr 24 17:06:56.083403 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.083383 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-510ff-predictor-7d689d4d56-kfdtb"]
Apr 24 17:06:56.092305 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.092288 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"
Apr 24 17:06:56.218096 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.218019 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-510ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/28719115-6329-46e4-8521-9fbf569b1aca-error-404-isvc-510ff-kube-rbac-proxy-sar-config\") pod \"28719115-6329-46e4-8521-9fbf569b1aca\" (UID: \"28719115-6329-46e4-8521-9fbf569b1aca\") "
Apr 24 17:06:56.218096 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.218056 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28719115-6329-46e4-8521-9fbf569b1aca-proxy-tls\") pod \"28719115-6329-46e4-8521-9fbf569b1aca\" (UID: \"28719115-6329-46e4-8521-9fbf569b1aca\") "
Apr 24 17:06:56.218319 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.218135 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kksp\" (UniqueName: \"kubernetes.io/projected/28719115-6329-46e4-8521-9fbf569b1aca-kube-api-access-4kksp\") pod \"28719115-6329-46e4-8521-9fbf569b1aca\" (UID: \"28719115-6329-46e4-8521-9fbf569b1aca\") "
Apr 24 17:06:56.218420 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.218401 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28719115-6329-46e4-8521-9fbf569b1aca-error-404-isvc-510ff-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-510ff-kube-rbac-proxy-sar-config") pod "28719115-6329-46e4-8521-9fbf569b1aca" (UID: "28719115-6329-46e4-8521-9fbf569b1aca"). InnerVolumeSpecName "error-404-isvc-510ff-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 17:06:56.220152 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.220106 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28719115-6329-46e4-8521-9fbf569b1aca-kube-api-access-4kksp" (OuterVolumeSpecName: "kube-api-access-4kksp") pod "28719115-6329-46e4-8521-9fbf569b1aca" (UID: "28719115-6329-46e4-8521-9fbf569b1aca"). InnerVolumeSpecName "kube-api-access-4kksp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 17:06:56.220273 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.220236 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28719115-6329-46e4-8521-9fbf569b1aca-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "28719115-6329-46e4-8521-9fbf569b1aca" (UID: "28719115-6329-46e4-8521-9fbf569b1aca"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 17:06:56.318807 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.318770 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4kksp\" (UniqueName: \"kubernetes.io/projected/28719115-6329-46e4-8521-9fbf569b1aca-kube-api-access-4kksp\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 17:06:56.318807 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.318804 2561 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-510ff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/28719115-6329-46e4-8521-9fbf569b1aca-error-404-isvc-510ff-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 17:06:56.318807 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:56.318814 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/28719115-6329-46e4-8521-9fbf569b1aca-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 17:06:57.069228 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:57.069186 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786" event={"ID":"28719115-6329-46e4-8521-9fbf569b1aca","Type":"ContainerDied","Data":"47745d39c03be646cd2437bd94b2aa80e642492f2ab9b322cc836ff3b2cdd318"}
Apr 24 17:06:57.069228 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:57.069214 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"
Apr 24 17:06:57.069735 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:57.069248 2561 scope.go:117] "RemoveContainer" containerID="5c755b61f0fecd0a7ee6d363d3f19b0d92ccf06ed5ff26e915397bda3f110e02"
Apr 24 17:06:57.078025 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:57.078004 2561 scope.go:117] "RemoveContainer" containerID="21e1989363af2ca3e2b86ce0cf017d0d3c38c4551ed4a5c6228eb7a70db4acd7"
Apr 24 17:06:57.090800 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:57.090779 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"]
Apr 24 17:06:57.094297 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:57.094278 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-510ff-predictor-7b675fd9f5-vq786"]
Apr 24 17:06:57.650550 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:57.650520 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28719115-6329-46e4-8521-9fbf569b1aca" path="/var/lib/kubelet/pods/28719115-6329-46e4-8521-9fbf569b1aca/volumes"
Apr 24 17:06:57.650906 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:57.650894 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" path="/var/lib/kubelet/pods/2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9/volumes"
Apr 24 17:06:59.053001 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:59.052973 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"
Apr 24 17:06:59.053437 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:06:59.053413 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 24 17:07:00.059717 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:00.059690 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"
Apr 24 17:07:00.060231 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:00.060204 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 24 17:07:09.053490 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:09.053451 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 24 17:07:10.060728 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:10.060690 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 24 17:07:19.053930 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:19.053848 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 24 17:07:20.060229 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:20.060192 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 24 17:07:28.014422 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.014387 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx"]
Apr 24 17:07:28.015068 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.015032 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" podUID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerName="kserve-container" containerID="cri-o://04eec3fc5902bc8416124380d2fbe400b4b9b731ec3a77f7effef7925ca6b8f6" gracePeriod=30
Apr 24 17:07:28.015233 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.015089 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" podUID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerName="kube-rbac-proxy" containerID="cri-o://f373a0fd36651c7be0bbb0a10baaefe057baf3103a44052f5db32f24f25873e9" gracePeriod=30
Apr 24 17:07:28.071136 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.070916 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679"]
Apr 24 17:07:28.071435 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.071413 2561
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerName="kube-rbac-proxy" Apr 24 17:07:28.071435 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.071434 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerName="kube-rbac-proxy" Apr 24 17:07:28.071607 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.071457 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kserve-container" Apr 24 17:07:28.071607 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.071466 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kserve-container" Apr 24 17:07:28.071607 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.071487 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kube-rbac-proxy" Apr 24 17:07:28.071607 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.071496 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kube-rbac-proxy" Apr 24 17:07:28.071607 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.071506 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerName="kserve-container" Apr 24 17:07:28.071607 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.071514 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerName="kserve-container" Apr 24 17:07:28.071607 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.071588 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kserve-container" Apr 24 17:07:28.071607 ip-10-0-128-44 kubenswrapper[2561]: I0424 
17:07:28.071601 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerName="kserve-container" Apr 24 17:07:28.072004 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.071611 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="28719115-6329-46e4-8521-9fbf569b1aca" containerName="kube-rbac-proxy" Apr 24 17:07:28.072004 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.071624 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b34bdd3-4fb5-4a3e-a5f0-5318dbf718a9" containerName="kube-rbac-proxy" Apr 24 17:07:28.075646 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.075627 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:07:28.078507 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.078488 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-8edb5-kube-rbac-proxy-sar-config\"" Apr 24 17:07:28.078610 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.078492 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-8edb5-predictor-serving-cert\"" Apr 24 17:07:28.084181 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.084159 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679"] Apr 24 17:07:28.094707 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.094684 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8"] Apr 24 17:07:28.094953 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.094917 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" 
podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kserve-container" containerID="cri-o://96d5c358eecc6f0e6a34cbb114324e8cac0cf6c55601e39399bb2631180e54ef" gracePeriod=30 Apr 24 17:07:28.095031 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.095008 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kube-rbac-proxy" containerID="cri-o://f9c6e078c7dcb35d6388b3d45c187128a6b0e4bdd2cee61ef40a3204643caf5f" gracePeriod=30 Apr 24 17:07:28.170158 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.170101 2561 generic.go:358] "Generic (PLEG): container finished" podID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerID="f373a0fd36651c7be0bbb0a10baaefe057baf3103a44052f5db32f24f25873e9" exitCode=2 Apr 24 17:07:28.170488 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.170196 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" event={"ID":"d51023aa-dae5-4673-8760-e8cf52d4cacb","Type":"ContainerDied","Data":"f373a0fd36651c7be0bbb0a10baaefe057baf3103a44052f5db32f24f25873e9"} Apr 24 17:07:28.178348 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.178323 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj"] Apr 24 17:07:28.181597 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.181582 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:07:28.184013 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.183989 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-8edb5-kube-rbac-proxy-sar-config\"" Apr 24 17:07:28.184163 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.184059 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-8edb5-predictor-serving-cert\"" Apr 24 17:07:28.191549 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.191527 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj"] Apr 24 17:07:28.275130 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.275026 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-8edb5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-success-200-isvc-8edb5-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-8edb5-predictor-7666dfcc56-g6679\" (UID: \"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0\") " pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:07:28.275130 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.275089 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81afd582-5ad6-4837-93c1-20edce9f24c9-proxy-tls\") pod \"error-404-isvc-8edb5-predictor-667c7d688c-fwxdj\" (UID: \"81afd582-5ad6-4837-93c1-20edce9f24c9\") " pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:07:28.275341 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.275139 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ndmb\" 
(UniqueName: \"kubernetes.io/projected/81afd582-5ad6-4837-93c1-20edce9f24c9-kube-api-access-9ndmb\") pod \"error-404-isvc-8edb5-predictor-667c7d688c-fwxdj\" (UID: \"81afd582-5ad6-4837-93c1-20edce9f24c9\") " pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:07:28.275341 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.275179 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-8edb5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/81afd582-5ad6-4837-93c1-20edce9f24c9-error-404-isvc-8edb5-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-8edb5-predictor-667c7d688c-fwxdj\" (UID: \"81afd582-5ad6-4837-93c1-20edce9f24c9\") " pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:07:28.275341 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.275236 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcf94\" (UniqueName: \"kubernetes.io/projected/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-kube-api-access-zcf94\") pod \"success-200-isvc-8edb5-predictor-7666dfcc56-g6679\" (UID: \"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0\") " pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:07:28.275341 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.275264 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-proxy-tls\") pod \"success-200-isvc-8edb5-predictor-7666dfcc56-g6679\" (UID: \"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0\") " pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:07:28.376486 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.376458 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-8edb5-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-success-200-isvc-8edb5-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-8edb5-predictor-7666dfcc56-g6679\" (UID: \"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0\") " pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:07:28.376663 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.376503 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81afd582-5ad6-4837-93c1-20edce9f24c9-proxy-tls\") pod \"error-404-isvc-8edb5-predictor-667c7d688c-fwxdj\" (UID: \"81afd582-5ad6-4837-93c1-20edce9f24c9\") " pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:07:28.376663 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.376530 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ndmb\" (UniqueName: \"kubernetes.io/projected/81afd582-5ad6-4837-93c1-20edce9f24c9-kube-api-access-9ndmb\") pod \"error-404-isvc-8edb5-predictor-667c7d688c-fwxdj\" (UID: \"81afd582-5ad6-4837-93c1-20edce9f24c9\") " pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:07:28.376663 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.376553 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-8edb5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/81afd582-5ad6-4837-93c1-20edce9f24c9-error-404-isvc-8edb5-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-8edb5-predictor-667c7d688c-fwxdj\" (UID: \"81afd582-5ad6-4837-93c1-20edce9f24c9\") " pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:07:28.376663 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.376595 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcf94\" (UniqueName: 
\"kubernetes.io/projected/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-kube-api-access-zcf94\") pod \"success-200-isvc-8edb5-predictor-7666dfcc56-g6679\" (UID: \"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0\") " pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:07:28.376663 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.376619 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-proxy-tls\") pod \"success-200-isvc-8edb5-predictor-7666dfcc56-g6679\" (UID: \"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0\") " pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:07:28.377181 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.377157 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-8edb5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-success-200-isvc-8edb5-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-8edb5-predictor-7666dfcc56-g6679\" (UID: \"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0\") " pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:07:28.377303 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.377205 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-8edb5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/81afd582-5ad6-4837-93c1-20edce9f24c9-error-404-isvc-8edb5-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-8edb5-predictor-667c7d688c-fwxdj\" (UID: \"81afd582-5ad6-4837-93c1-20edce9f24c9\") " pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:07:28.378876 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.378857 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/81afd582-5ad6-4837-93c1-20edce9f24c9-proxy-tls\") pod \"error-404-isvc-8edb5-predictor-667c7d688c-fwxdj\" (UID: \"81afd582-5ad6-4837-93c1-20edce9f24c9\") " pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:07:28.378951 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.378885 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-proxy-tls\") pod \"success-200-isvc-8edb5-predictor-7666dfcc56-g6679\" (UID: \"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0\") " pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:07:28.385271 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.385250 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ndmb\" (UniqueName: \"kubernetes.io/projected/81afd582-5ad6-4837-93c1-20edce9f24c9-kube-api-access-9ndmb\") pod \"error-404-isvc-8edb5-predictor-667c7d688c-fwxdj\" (UID: \"81afd582-5ad6-4837-93c1-20edce9f24c9\") " pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:07:28.385411 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.385389 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcf94\" (UniqueName: \"kubernetes.io/projected/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-kube-api-access-zcf94\") pod \"success-200-isvc-8edb5-predictor-7666dfcc56-g6679\" (UID: \"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0\") " pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:07:28.387099 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.387080 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:07:28.492920 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.492892 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:07:28.509539 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.509518 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679"] Apr 24 17:07:28.511783 ip-10-0-128-44 kubenswrapper[2561]: W0424 17:07:28.511751 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b25dd1a_05c9_4bc1_9002_cc05dbe16ab0.slice/crio-144395fc912e59d9511526b310beec18c8fd0cb8f17ce496ec9aa97165cd4c5a WatchSource:0}: Error finding container 144395fc912e59d9511526b310beec18c8fd0cb8f17ce496ec9aa97165cd4c5a: Status 404 returned error can't find the container with id 144395fc912e59d9511526b310beec18c8fd0cb8f17ce496ec9aa97165cd4c5a Apr 24 17:07:28.621986 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:28.621955 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj"] Apr 24 17:07:28.623805 ip-10-0-128-44 kubenswrapper[2561]: W0424 17:07:28.623784 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81afd582_5ad6_4837_93c1_20edce9f24c9.slice/crio-32e03ab0ffd0565b61d6c875d83677a6cfe09475b3a2b5c1a743bcfbefc3d51f WatchSource:0}: Error finding container 32e03ab0ffd0565b61d6c875d83677a6cfe09475b3a2b5c1a743bcfbefc3d51f: Status 404 returned error can't find the container with id 32e03ab0ffd0565b61d6c875d83677a6cfe09475b3a2b5c1a743bcfbefc3d51f Apr 24 17:07:29.053943 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:29.053912 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: 
connection refused" Apr 24 17:07:29.176161 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:29.175991 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" event={"ID":"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0","Type":"ContainerStarted","Data":"934bcf2698d1f2a605e9931c53abd38cd5e8fcea92ac6d5a3f0891de25502374"} Apr 24 17:07:29.176161 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:29.176030 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" event={"ID":"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0","Type":"ContainerStarted","Data":"a834a3a79f7332c36f97e0c22d2d4b765598945ce86f18658d4bcd619cd890fe"} Apr 24 17:07:29.176161 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:29.176045 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" event={"ID":"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0","Type":"ContainerStarted","Data":"144395fc912e59d9511526b310beec18c8fd0cb8f17ce496ec9aa97165cd4c5a"} Apr 24 17:07:29.176469 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:29.176202 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:07:29.176469 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:29.176236 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:07:29.177409 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:29.177377 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" podUID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 24 17:07:29.178076 ip-10-0-128-44 
kubenswrapper[2561]: I0424 17:07:29.178055 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" event={"ID":"81afd582-5ad6-4837-93c1-20edce9f24c9","Type":"ContainerStarted","Data":"995117ead0b8e926cba1dbc25119385da76cccbd5a362e9d65d570461b6f2dca"} Apr 24 17:07:29.178214 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:29.178083 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" event={"ID":"81afd582-5ad6-4837-93c1-20edce9f24c9","Type":"ContainerStarted","Data":"a47057cf2b3919c0aef2ffae02742b6ead2ebe4703801a952d55df6a57c9aa08"} Apr 24 17:07:29.178214 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:29.178100 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" event={"ID":"81afd582-5ad6-4837-93c1-20edce9f24c9","Type":"ContainerStarted","Data":"32e03ab0ffd0565b61d6c875d83677a6cfe09475b3a2b5c1a743bcfbefc3d51f"} Apr 24 17:07:29.178214 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:29.178189 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:07:29.179757 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:29.179736 2561 generic.go:358] "Generic (PLEG): container finished" podID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerID="f9c6e078c7dcb35d6388b3d45c187128a6b0e4bdd2cee61ef40a3204643caf5f" exitCode=2 Apr 24 17:07:29.179866 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:29.179808 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" event={"ID":"5735bf3c-3cfb-4792-825c-8d01882932ba","Type":"ContainerDied","Data":"f9c6e078c7dcb35d6388b3d45c187128a6b0e4bdd2cee61ef40a3204643caf5f"} Apr 24 17:07:29.195921 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:29.195885 2561 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" podStartSLOduration=1.195873288 podStartE2EDuration="1.195873288s" podCreationTimestamp="2026-04-24 17:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:07:29.194320399 +0000 UTC m=+1718.047191112" watchObservedRunningTime="2026-04-24 17:07:29.195873288 +0000 UTC m=+1718.048744001" Apr 24 17:07:29.216880 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:29.216838 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" podStartSLOduration=1.216827229 podStartE2EDuration="1.216827229s" podCreationTimestamp="2026-04-24 17:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:07:29.215436154 +0000 UTC m=+1718.068306867" watchObservedRunningTime="2026-04-24 17:07:29.216827229 +0000 UTC m=+1718.069697941" Apr 24 17:07:30.060127 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:30.060076 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 24 17:07:30.183606 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:30.183571 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:07:30.183743 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:30.183622 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" 
podUID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 24 17:07:30.184764 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:30.184738 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" podUID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 24 17:07:30.455643 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:30.455609 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.42:8643/healthz\": dial tcp 10.133.0.42:8643: connect: connection refused" Apr 24 17:07:30.459874 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:30.459852 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 24 17:07:31.186566 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.186530 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" podUID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 24 17:07:31.453462 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.453440 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" Apr 24 17:07:31.502483 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.502451 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq5jz\" (UniqueName: \"kubernetes.io/projected/d51023aa-dae5-4673-8760-e8cf52d4cacb-kube-api-access-qq5jz\") pod \"d51023aa-dae5-4673-8760-e8cf52d4cacb\" (UID: \"d51023aa-dae5-4673-8760-e8cf52d4cacb\") " Apr 24 17:07:31.502643 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.502491 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-77cb4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d51023aa-dae5-4673-8760-e8cf52d4cacb-success-200-isvc-77cb4-kube-rbac-proxy-sar-config\") pod \"d51023aa-dae5-4673-8760-e8cf52d4cacb\" (UID: \"d51023aa-dae5-4673-8760-e8cf52d4cacb\") " Apr 24 17:07:31.502643 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.502517 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d51023aa-dae5-4673-8760-e8cf52d4cacb-proxy-tls\") pod \"d51023aa-dae5-4673-8760-e8cf52d4cacb\" (UID: \"d51023aa-dae5-4673-8760-e8cf52d4cacb\") " Apr 24 17:07:31.502820 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.502800 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d51023aa-dae5-4673-8760-e8cf52d4cacb-success-200-isvc-77cb4-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-77cb4-kube-rbac-proxy-sar-config") pod "d51023aa-dae5-4673-8760-e8cf52d4cacb" (UID: "d51023aa-dae5-4673-8760-e8cf52d4cacb"). InnerVolumeSpecName "success-200-isvc-77cb4-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 17:07:31.504481 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.504457 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d51023aa-dae5-4673-8760-e8cf52d4cacb-kube-api-access-qq5jz" (OuterVolumeSpecName: "kube-api-access-qq5jz") pod "d51023aa-dae5-4673-8760-e8cf52d4cacb" (UID: "d51023aa-dae5-4673-8760-e8cf52d4cacb"). InnerVolumeSpecName "kube-api-access-qq5jz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 17:07:31.504590 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.504550 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d51023aa-dae5-4673-8760-e8cf52d4cacb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d51023aa-dae5-4673-8760-e8cf52d4cacb" (UID: "d51023aa-dae5-4673-8760-e8cf52d4cacb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 17:07:31.603705 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.603664 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qq5jz\" (UniqueName: \"kubernetes.io/projected/d51023aa-dae5-4673-8760-e8cf52d4cacb-kube-api-access-qq5jz\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 17:07:31.603705 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.603696 2561 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-77cb4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d51023aa-dae5-4673-8760-e8cf52d4cacb-success-200-isvc-77cb4-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 17:07:31.603705 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.603706 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d51023aa-dae5-4673-8760-e8cf52d4cacb-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 17:07:31.845713 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.845684 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8"
Apr 24 17:07:31.905955 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.905926 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5735bf3c-3cfb-4792-825c-8d01882932ba-proxy-tls\") pod \"5735bf3c-3cfb-4792-825c-8d01882932ba\" (UID: \"5735bf3c-3cfb-4792-825c-8d01882932ba\") "
Apr 24 17:07:31.906133 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.905991 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sj5f\" (UniqueName: \"kubernetes.io/projected/5735bf3c-3cfb-4792-825c-8d01882932ba-kube-api-access-6sj5f\") pod \"5735bf3c-3cfb-4792-825c-8d01882932ba\" (UID: \"5735bf3c-3cfb-4792-825c-8d01882932ba\") "
Apr 24 17:07:31.906133 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.906050 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-77cb4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5735bf3c-3cfb-4792-825c-8d01882932ba-error-404-isvc-77cb4-kube-rbac-proxy-sar-config\") pod \"5735bf3c-3cfb-4792-825c-8d01882932ba\" (UID: \"5735bf3c-3cfb-4792-825c-8d01882932ba\") "
Apr 24 17:07:31.906401 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.906380 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5735bf3c-3cfb-4792-825c-8d01882932ba-error-404-isvc-77cb4-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-77cb4-kube-rbac-proxy-sar-config") pod "5735bf3c-3cfb-4792-825c-8d01882932ba" (UID: "5735bf3c-3cfb-4792-825c-8d01882932ba"). InnerVolumeSpecName "error-404-isvc-77cb4-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 17:07:31.907970 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.907946 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5735bf3c-3cfb-4792-825c-8d01882932ba-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5735bf3c-3cfb-4792-825c-8d01882932ba" (UID: "5735bf3c-3cfb-4792-825c-8d01882932ba"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 17:07:31.907970 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:31.907946 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5735bf3c-3cfb-4792-825c-8d01882932ba-kube-api-access-6sj5f" (OuterVolumeSpecName: "kube-api-access-6sj5f") pod "5735bf3c-3cfb-4792-825c-8d01882932ba" (UID: "5735bf3c-3cfb-4792-825c-8d01882932ba"). InnerVolumeSpecName "kube-api-access-6sj5f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 17:07:32.006855 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.006758 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6sj5f\" (UniqueName: \"kubernetes.io/projected/5735bf3c-3cfb-4792-825c-8d01882932ba-kube-api-access-6sj5f\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 17:07:32.006855 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.006799 2561 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-77cb4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5735bf3c-3cfb-4792-825c-8d01882932ba-error-404-isvc-77cb4-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 17:07:32.006855 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.006811 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5735bf3c-3cfb-4792-825c-8d01882932ba-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\""
Apr 24 17:07:32.191633 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.191602 2561 generic.go:358] "Generic (PLEG): container finished" podID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerID="04eec3fc5902bc8416124380d2fbe400b4b9b731ec3a77f7effef7925ca6b8f6" exitCode=0
Apr 24 17:07:32.192028 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.191673 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx"
Apr 24 17:07:32.192028 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.191691 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" event={"ID":"d51023aa-dae5-4673-8760-e8cf52d4cacb","Type":"ContainerDied","Data":"04eec3fc5902bc8416124380d2fbe400b4b9b731ec3a77f7effef7925ca6b8f6"}
Apr 24 17:07:32.192028 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.191735 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx" event={"ID":"d51023aa-dae5-4673-8760-e8cf52d4cacb","Type":"ContainerDied","Data":"a30ee6a09a0a9337ab490a2a6a4af4f1b0928db08f880cfd71e0c6f571326875"}
Apr 24 17:07:32.192028 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.191760 2561 scope.go:117] "RemoveContainer" containerID="f373a0fd36651c7be0bbb0a10baaefe057baf3103a44052f5db32f24f25873e9"
Apr 24 17:07:32.193249 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.193226 2561 generic.go:358] "Generic (PLEG): container finished" podID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerID="96d5c358eecc6f0e6a34cbb114324e8cac0cf6c55601e39399bb2631180e54ef" exitCode=0
Apr 24 17:07:32.193360 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.193296 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8"
Apr 24 17:07:32.193360 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.193299 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" event={"ID":"5735bf3c-3cfb-4792-825c-8d01882932ba","Type":"ContainerDied","Data":"96d5c358eecc6f0e6a34cbb114324e8cac0cf6c55601e39399bb2631180e54ef"}
Apr 24 17:07:32.193360 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.193323 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8" event={"ID":"5735bf3c-3cfb-4792-825c-8d01882932ba","Type":"ContainerDied","Data":"e8c557e5519bafd045a255a4a1635641d6afb68b399d34fe660175a1bb96e000"}
Apr 24 17:07:32.199574 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.199553 2561 scope.go:117] "RemoveContainer" containerID="04eec3fc5902bc8416124380d2fbe400b4b9b731ec3a77f7effef7925ca6b8f6"
Apr 24 17:07:32.206576 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.206492 2561 scope.go:117] "RemoveContainer" containerID="f373a0fd36651c7be0bbb0a10baaefe057baf3103a44052f5db32f24f25873e9"
Apr 24 17:07:32.206997 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:07:32.206972 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f373a0fd36651c7be0bbb0a10baaefe057baf3103a44052f5db32f24f25873e9\": container with ID starting with f373a0fd36651c7be0bbb0a10baaefe057baf3103a44052f5db32f24f25873e9 not found: ID does not exist" containerID="f373a0fd36651c7be0bbb0a10baaefe057baf3103a44052f5db32f24f25873e9"
Apr 24 17:07:32.207073 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.207006 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f373a0fd36651c7be0bbb0a10baaefe057baf3103a44052f5db32f24f25873e9"} err="failed to get container status \"f373a0fd36651c7be0bbb0a10baaefe057baf3103a44052f5db32f24f25873e9\": rpc error: code = NotFound desc = could not find container \"f373a0fd36651c7be0bbb0a10baaefe057baf3103a44052f5db32f24f25873e9\": container with ID starting with f373a0fd36651c7be0bbb0a10baaefe057baf3103a44052f5db32f24f25873e9 not found: ID does not exist"
Apr 24 17:07:32.207073 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.207029 2561 scope.go:117] "RemoveContainer" containerID="04eec3fc5902bc8416124380d2fbe400b4b9b731ec3a77f7effef7925ca6b8f6"
Apr 24 17:07:32.207331 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:07:32.207312 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04eec3fc5902bc8416124380d2fbe400b4b9b731ec3a77f7effef7925ca6b8f6\": container with ID starting with 04eec3fc5902bc8416124380d2fbe400b4b9b731ec3a77f7effef7925ca6b8f6 not found: ID does not exist" containerID="04eec3fc5902bc8416124380d2fbe400b4b9b731ec3a77f7effef7925ca6b8f6"
Apr 24 17:07:32.207405 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.207336 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04eec3fc5902bc8416124380d2fbe400b4b9b731ec3a77f7effef7925ca6b8f6"} err="failed to get container status \"04eec3fc5902bc8416124380d2fbe400b4b9b731ec3a77f7effef7925ca6b8f6\": rpc error: code = NotFound desc = could not find container \"04eec3fc5902bc8416124380d2fbe400b4b9b731ec3a77f7effef7925ca6b8f6\": container with ID starting with 04eec3fc5902bc8416124380d2fbe400b4b9b731ec3a77f7effef7925ca6b8f6 not found: ID does not exist"
Apr 24 17:07:32.207405 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.207352 2561 scope.go:117] "RemoveContainer" containerID="f9c6e078c7dcb35d6388b3d45c187128a6b0e4bdd2cee61ef40a3204643caf5f"
Apr 24 17:07:32.208563 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.208548 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx"]
Apr 24 17:07:32.211980 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.211961 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-77cb4-predictor-79db889498-sjlnx"]
Apr 24 17:07:32.214520 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.214505 2561 scope.go:117] "RemoveContainer" containerID="96d5c358eecc6f0e6a34cbb114324e8cac0cf6c55601e39399bb2631180e54ef"
Apr 24 17:07:32.221086 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.221070 2561 scope.go:117] "RemoveContainer" containerID="f9c6e078c7dcb35d6388b3d45c187128a6b0e4bdd2cee61ef40a3204643caf5f"
Apr 24 17:07:32.221336 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:07:32.221320 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9c6e078c7dcb35d6388b3d45c187128a6b0e4bdd2cee61ef40a3204643caf5f\": container with ID starting with f9c6e078c7dcb35d6388b3d45c187128a6b0e4bdd2cee61ef40a3204643caf5f not found: ID does not exist" containerID="f9c6e078c7dcb35d6388b3d45c187128a6b0e4bdd2cee61ef40a3204643caf5f"
Apr 24 17:07:32.221405 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.221339 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9c6e078c7dcb35d6388b3d45c187128a6b0e4bdd2cee61ef40a3204643caf5f"} err="failed to get container status \"f9c6e078c7dcb35d6388b3d45c187128a6b0e4bdd2cee61ef40a3204643caf5f\": rpc error: code = NotFound desc = could not find container \"f9c6e078c7dcb35d6388b3d45c187128a6b0e4bdd2cee61ef40a3204643caf5f\": container with ID starting with f9c6e078c7dcb35d6388b3d45c187128a6b0e4bdd2cee61ef40a3204643caf5f not found: ID does not exist"
Apr 24 17:07:32.221405 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.221354 2561 scope.go:117] "RemoveContainer" containerID="96d5c358eecc6f0e6a34cbb114324e8cac0cf6c55601e39399bb2631180e54ef"
Apr 24 17:07:32.221596 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:07:32.221575 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96d5c358eecc6f0e6a34cbb114324e8cac0cf6c55601e39399bb2631180e54ef\": container with ID starting with 96d5c358eecc6f0e6a34cbb114324e8cac0cf6c55601e39399bb2631180e54ef not found: ID does not exist" containerID="96d5c358eecc6f0e6a34cbb114324e8cac0cf6c55601e39399bb2631180e54ef"
Apr 24 17:07:32.221650 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.221603 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96d5c358eecc6f0e6a34cbb114324e8cac0cf6c55601e39399bb2631180e54ef"} err="failed to get container status \"96d5c358eecc6f0e6a34cbb114324e8cac0cf6c55601e39399bb2631180e54ef\": rpc error: code = NotFound desc = could not find container \"96d5c358eecc6f0e6a34cbb114324e8cac0cf6c55601e39399bb2631180e54ef\": container with ID starting with 96d5c358eecc6f0e6a34cbb114324e8cac0cf6c55601e39399bb2631180e54ef not found: ID does not exist"
Apr 24 17:07:32.228315 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.228294 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8"]
Apr 24 17:07:32.231870 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:32.231851 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-77cb4-predictor-6cd577bd49-pnql8"]
Apr 24 17:07:33.651315 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:33.651284 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" path="/var/lib/kubelet/pods/5735bf3c-3cfb-4792-825c-8d01882932ba/volumes"
Apr 24 17:07:33.651746 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:33.651679 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d51023aa-dae5-4673-8760-e8cf52d4cacb" path="/var/lib/kubelet/pods/d51023aa-dae5-4673-8760-e8cf52d4cacb/volumes"
Apr 24 17:07:35.188398 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:35.188367 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679"
Apr 24 17:07:35.188925 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:35.188899 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" podUID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 24 17:07:36.190701 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:36.190675 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj"
Apr 24 17:07:36.191299 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:36.191270 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" podUID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused"
Apr 24 17:07:39.054257 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:39.054228 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"
Apr 24 17:07:40.061292 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:40.061263 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"
Apr 24 17:07:45.189589 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:45.189553 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" podUID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 24 17:07:46.191311 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:46.191272 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" podUID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused"
Apr 24 17:07:55.189787 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:55.189750 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" podUID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"
Apr 24 17:07:56.192240 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:07:56.192203 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" podUID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused"
Apr 24 17:08:02.470914 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.470850 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"]
Apr 24 17:08:02.471387 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.471237 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerName="kserve-container" containerID="cri-o://f0f997b2a62569b9b502cff352c3f05e99eccbb4c557c12543be5a6533eb0c0e" gracePeriod=30
Apr 24 17:08:02.471492 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.471454 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerName="kube-rbac-proxy" containerID="cri-o://6727c47cd0d69d8c0e967dffbe02f602b3280fef91e276ead6454bc54156fdcf" gracePeriod=30
Apr 24 17:08:02.508636 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.508605 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"]
Apr 24 17:08:02.509191 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.509173 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kube-rbac-proxy"
Apr 24 17:08:02.509280 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.509193 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kube-rbac-proxy"
Apr 24 17:08:02.509280 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.509212 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerName="kserve-container"
Apr 24 17:08:02.509280 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.509220 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerName="kserve-container"
Apr 24 17:08:02.509280 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.509235 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kserve-container"
Apr 24 17:08:02.509280 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.509243 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kserve-container"
Apr 24 17:08:02.509280 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.509263 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerName="kube-rbac-proxy"
Apr 24 17:08:02.509280 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.509270 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerName="kube-rbac-proxy"
Apr 24 17:08:02.509624 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.509364 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kube-rbac-proxy"
Apr 24 17:08:02.509624 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.509379 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="5735bf3c-3cfb-4792-825c-8d01882932ba" containerName="kserve-container"
Apr 24 17:08:02.509624 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.509391 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerName="kube-rbac-proxy"
Apr 24 17:08:02.509624 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.509399 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="d51023aa-dae5-4673-8760-e8cf52d4cacb" containerName="kserve-container"
Apr 24 17:08:02.521167 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.520425 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"
Apr 24 17:08:02.523791 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.523362 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-9c25c-kube-rbac-proxy-sar-config\""
Apr 24 17:08:02.523791 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.523612 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-9c25c-predictor-serving-cert\""
Apr 24 17:08:02.526147 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.525731 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"]
Apr 24 17:08:02.538569 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.538550 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"]
Apr 24 17:08:02.538785 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.538766 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerName="kserve-container" containerID="cri-o://b8953e6ce84cea8fd0f8522647e83467ba2b4667f1ab7217d95fee1af4a13694" gracePeriod=30
Apr 24 17:08:02.538898 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.538856 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerName="kube-rbac-proxy" containerID="cri-o://896c9fc48e23aa1fb9ad9d9b7cbdc39435f546dccb885446191c88ff8e11f2fe" gracePeriod=30
Apr 24 17:08:02.612529 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.612497 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"]
Apr 24 17:08:02.615775 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.615761 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"
Apr 24 17:08:02.618307 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.618272 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-9c25c-predictor-serving-cert\""
Apr 24 17:08:02.618307 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.618297 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-9c25c-kube-rbac-proxy-sar-config\""
Apr 24 17:08:02.626181 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.626158 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"]
Apr 24 17:08:02.643873 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.643843 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfn4l\" (UniqueName: \"kubernetes.io/projected/f42d6fa9-8066-419f-8f4f-c795d797031b-kube-api-access-bfn4l\") pod \"success-200-isvc-9c25c-predictor-69b79585f6-6nvqj\" (UID: \"f42d6fa9-8066-419f-8f4f-c795d797031b\") " pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"
Apr 24 17:08:02.644079 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.644060 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f42d6fa9-8066-419f-8f4f-c795d797031b-proxy-tls\") pod \"success-200-isvc-9c25c-predictor-69b79585f6-6nvqj\" (UID: \"f42d6fa9-8066-419f-8f4f-c795d797031b\") " pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"
Apr 24 17:08:02.644203 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.644150 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-9c25c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f42d6fa9-8066-419f-8f4f-c795d797031b-success-200-isvc-9c25c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-9c25c-predictor-69b79585f6-6nvqj\" (UID: \"f42d6fa9-8066-419f-8f4f-c795d797031b\") " pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"
Apr 24 17:08:02.745470 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.745392 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-9c25c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/427c28cc-938e-40b6-b4c0-28d6ac14c441-error-404-isvc-9c25c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-9c25c-predictor-577bbdbf95-47mlg\" (UID: \"427c28cc-938e-40b6-b4c0-28d6ac14c441\") " pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"
Apr 24 17:08:02.745470 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.745465 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfn4l\" (UniqueName: \"kubernetes.io/projected/f42d6fa9-8066-419f-8f4f-c795d797031b-kube-api-access-bfn4l\") pod \"success-200-isvc-9c25c-predictor-69b79585f6-6nvqj\" (UID: \"f42d6fa9-8066-419f-8f4f-c795d797031b\") " pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"
Apr 24 17:08:02.745662 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.745485 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f42d6fa9-8066-419f-8f4f-c795d797031b-proxy-tls\") pod \"success-200-isvc-9c25c-predictor-69b79585f6-6nvqj\" (UID: \"f42d6fa9-8066-419f-8f4f-c795d797031b\") " pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"
Apr 24 17:08:02.745662 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.745520 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-9c25c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f42d6fa9-8066-419f-8f4f-c795d797031b-success-200-isvc-9c25c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-9c25c-predictor-69b79585f6-6nvqj\" (UID: \"f42d6fa9-8066-419f-8f4f-c795d797031b\") " pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"
Apr 24 17:08:02.745662 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:08:02.745616 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-serving-cert: secret "success-200-isvc-9c25c-predictor-serving-cert" not found
Apr 24 17:08:02.745662 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.745641 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/427c28cc-938e-40b6-b4c0-28d6ac14c441-proxy-tls\") pod \"error-404-isvc-9c25c-predictor-577bbdbf95-47mlg\" (UID: \"427c28cc-938e-40b6-b4c0-28d6ac14c441\") " pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"
Apr 24 17:08:02.745861 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:08:02.745717 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f42d6fa9-8066-419f-8f4f-c795d797031b-proxy-tls podName:f42d6fa9-8066-419f-8f4f-c795d797031b nodeName:}" failed. No retries permitted until 2026-04-24 17:08:03.245695764 +0000 UTC m=+1752.098566459 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f42d6fa9-8066-419f-8f4f-c795d797031b-proxy-tls") pod "success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" (UID: "f42d6fa9-8066-419f-8f4f-c795d797031b") : secret "success-200-isvc-9c25c-predictor-serving-cert" not found
Apr 24 17:08:02.745861 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.745764 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dblnh\" (UniqueName: \"kubernetes.io/projected/427c28cc-938e-40b6-b4c0-28d6ac14c441-kube-api-access-dblnh\") pod \"error-404-isvc-9c25c-predictor-577bbdbf95-47mlg\" (UID: \"427c28cc-938e-40b6-b4c0-28d6ac14c441\") " pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"
Apr 24 17:08:02.746251 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.746233 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-9c25c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f42d6fa9-8066-419f-8f4f-c795d797031b-success-200-isvc-9c25c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-9c25c-predictor-69b79585f6-6nvqj\" (UID: \"f42d6fa9-8066-419f-8f4f-c795d797031b\") " pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"
Apr 24 17:08:02.754320 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.754301 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfn4l\" (UniqueName: \"kubernetes.io/projected/f42d6fa9-8066-419f-8f4f-c795d797031b-kube-api-access-bfn4l\") pod \"success-200-isvc-9c25c-predictor-69b79585f6-6nvqj\" (UID: \"f42d6fa9-8066-419f-8f4f-c795d797031b\") " pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"
Apr 24 17:08:02.846995 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.846965 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dblnh\" (UniqueName: \"kubernetes.io/projected/427c28cc-938e-40b6-b4c0-28d6ac14c441-kube-api-access-dblnh\") pod \"error-404-isvc-9c25c-predictor-577bbdbf95-47mlg\" (UID: \"427c28cc-938e-40b6-b4c0-28d6ac14c441\") " pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"
Apr 24 17:08:02.847140 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.847001 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-9c25c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/427c28cc-938e-40b6-b4c0-28d6ac14c441-error-404-isvc-9c25c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-9c25c-predictor-577bbdbf95-47mlg\" (UID: \"427c28cc-938e-40b6-b4c0-28d6ac14c441\") " pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"
Apr 24 17:08:02.847140 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.847084 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/427c28cc-938e-40b6-b4c0-28d6ac14c441-proxy-tls\") pod \"error-404-isvc-9c25c-predictor-577bbdbf95-47mlg\" (UID: \"427c28cc-938e-40b6-b4c0-28d6ac14c441\") " pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"
Apr 24 17:08:02.847232 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:08:02.847208 2561 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-serving-cert: secret "error-404-isvc-9c25c-predictor-serving-cert" not found
Apr 24 17:08:02.847287 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:08:02.847279 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/427c28cc-938e-40b6-b4c0-28d6ac14c441-proxy-tls podName:427c28cc-938e-40b6-b4c0-28d6ac14c441 nodeName:}" failed. No retries permitted until 2026-04-24 17:08:03.347260408 +0000 UTC m=+1752.200131103 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/427c28cc-938e-40b6-b4c0-28d6ac14c441-proxy-tls") pod "error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" (UID: "427c28cc-938e-40b6-b4c0-28d6ac14c441") : secret "error-404-isvc-9c25c-predictor-serving-cert" not found
Apr 24 17:08:02.847708 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.847688 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-9c25c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/427c28cc-938e-40b6-b4c0-28d6ac14c441-error-404-isvc-9c25c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-9c25c-predictor-577bbdbf95-47mlg\" (UID: \"427c28cc-938e-40b6-b4c0-28d6ac14c441\") " pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"
Apr 24 17:08:02.856360 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:02.856334 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dblnh\" (UniqueName: \"kubernetes.io/projected/427c28cc-938e-40b6-b4c0-28d6ac14c441-kube-api-access-dblnh\") pod \"error-404-isvc-9c25c-predictor-577bbdbf95-47mlg\" (UID: \"427c28cc-938e-40b6-b4c0-28d6ac14c441\") " pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"
Apr 24 17:08:03.250782 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:03.250748 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f42d6fa9-8066-419f-8f4f-c795d797031b-proxy-tls\") pod \"success-200-isvc-9c25c-predictor-69b79585f6-6nvqj\" (UID: \"f42d6fa9-8066-419f-8f4f-c795d797031b\") " pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"
Apr 24 17:08:03.253042 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:03.253020 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f42d6fa9-8066-419f-8f4f-c795d797031b-proxy-tls\") pod \"success-200-isvc-9c25c-predictor-69b79585f6-6nvqj\" (UID: \"f42d6fa9-8066-419f-8f4f-c795d797031b\") " pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"
Apr 24 17:08:03.294929 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:03.294901 2561 generic.go:358] "Generic (PLEG): container finished" podID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerID="896c9fc48e23aa1fb9ad9d9b7cbdc39435f546dccb885446191c88ff8e11f2fe" exitCode=2
Apr 24 17:08:03.295083 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:03.294976 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" event={"ID":"ded5d261-7d83-4e14-bcb0-56c408e51992","Type":"ContainerDied","Data":"896c9fc48e23aa1fb9ad9d9b7cbdc39435f546dccb885446191c88ff8e11f2fe"}
Apr 24 17:08:03.296310 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:03.296287 2561 generic.go:358] "Generic (PLEG): container finished" podID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerID="6727c47cd0d69d8c0e967dffbe02f602b3280fef91e276ead6454bc54156fdcf" exitCode=2
Apr 24 17:08:03.296426 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:03.296357 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" event={"ID":"ae47a928-b7d6-4a05-9a47-2cf95df423ca","Type":"ContainerDied","Data":"6727c47cd0d69d8c0e967dffbe02f602b3280fef91e276ead6454bc54156fdcf"}
Apr 24 17:08:03.351463 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:03.351444 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/427c28cc-938e-40b6-b4c0-28d6ac14c441-proxy-tls\") pod \"error-404-isvc-9c25c-predictor-577bbdbf95-47mlg\" (UID: \"427c28cc-938e-40b6-b4c0-28d6ac14c441\") " pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"
Apr 24 17:08:03.353679 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:03.353658 2561 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/427c28cc-938e-40b6-b4c0-28d6ac14c441-proxy-tls\") pod \"error-404-isvc-9c25c-predictor-577bbdbf95-47mlg\" (UID: \"427c28cc-938e-40b6-b4c0-28d6ac14c441\") " pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" Apr 24 17:08:03.436592 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:03.436561 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" Apr 24 17:08:03.530056 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:03.530024 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" Apr 24 17:08:03.558481 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:03.558451 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"] Apr 24 17:08:03.559755 ip-10-0-128-44 kubenswrapper[2561]: W0424 17:08:03.559726 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf42d6fa9_8066_419f_8f4f_c795d797031b.slice/crio-ef4d2450624702ff069c39cf29454418162f9b45bba9d846ae5ad38426268009 WatchSource:0}: Error finding container ef4d2450624702ff069c39cf29454418162f9b45bba9d846ae5ad38426268009: Status 404 returned error can't find the container with id ef4d2450624702ff069c39cf29454418162f9b45bba9d846ae5ad38426268009 Apr 24 17:08:03.652528 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:03.652504 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"] Apr 24 17:08:03.654373 ip-10-0-128-44 kubenswrapper[2561]: W0424 17:08:03.654350 2561 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod427c28cc_938e_40b6_b4c0_28d6ac14c441.slice/crio-2690b97336cd1fc32a184a986c8a3a138905f098e6339842dc2e8b512d5514cd WatchSource:0}: Error finding container 2690b97336cd1fc32a184a986c8a3a138905f098e6339842dc2e8b512d5514cd: Status 404 returned error can't find the container with id 2690b97336cd1fc32a184a986c8a3a138905f098e6339842dc2e8b512d5514cd Apr 24 17:08:04.049918 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:04.049840 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.43:8643/healthz\": dial tcp 10.133.0.43:8643: connect: connection refused" Apr 24 17:08:04.301561 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:04.301470 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" event={"ID":"f42d6fa9-8066-419f-8f4f-c795d797031b","Type":"ContainerStarted","Data":"588faf0db07844d86a23a6eafb15d4892e3b4246278d46a7dd89a4e648619a44"} Apr 24 17:08:04.301561 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:04.301509 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" event={"ID":"f42d6fa9-8066-419f-8f4f-c795d797031b","Type":"ContainerStarted","Data":"32a464e72e4c9b4eb7d0d0bc4226b4809ec76733fa394d908295b83f335b9058"} Apr 24 17:08:04.301561 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:04.301526 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" Apr 24 17:08:04.301561 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:04.301537 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" event={"ID":"f42d6fa9-8066-419f-8f4f-c795d797031b","Type":"ContainerStarted","Data":"ef4d2450624702ff069c39cf29454418162f9b45bba9d846ae5ad38426268009"} Apr 24 17:08:04.301561 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:04.301549 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" Apr 24 17:08:04.303082 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:04.303054 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 24 17:08:04.303366 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:04.303340 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" event={"ID":"427c28cc-938e-40b6-b4c0-28d6ac14c441","Type":"ContainerStarted","Data":"50cc3c58405901429b978c36eb74beb44d914aff69aa44fcb8c4231594171f53"} Apr 24 17:08:04.303460 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:04.303372 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" event={"ID":"427c28cc-938e-40b6-b4c0-28d6ac14c441","Type":"ContainerStarted","Data":"ea5af9a8542e1555bd9e850b8ea8aa382b887e070f64010ef5d94d8f10ba44d6"} Apr 24 17:08:04.303460 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:04.303385 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" event={"ID":"427c28cc-938e-40b6-b4c0-28d6ac14c441","Type":"ContainerStarted","Data":"2690b97336cd1fc32a184a986c8a3a138905f098e6339842dc2e8b512d5514cd"} Apr 24 17:08:04.303545 ip-10-0-128-44 kubenswrapper[2561]: I0424 
17:08:04.303466 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" Apr 24 17:08:04.318930 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:04.318887 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" podStartSLOduration=2.31887574 podStartE2EDuration="2.31887574s" podCreationTimestamp="2026-04-24 17:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:08:04.318458057 +0000 UTC m=+1753.171328767" watchObservedRunningTime="2026-04-24 17:08:04.31887574 +0000 UTC m=+1753.171746525" Apr 24 17:08:04.336001 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:04.335958 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" podStartSLOduration=2.335944387 podStartE2EDuration="2.335944387s" podCreationTimestamp="2026-04-24 17:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:08:04.334319636 +0000 UTC m=+1753.187190347" watchObservedRunningTime="2026-04-24 17:08:04.335944387 +0000 UTC m=+1753.188815101" Apr 24 17:08:05.057217 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:05.057180 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.44:8643/healthz\": dial tcp 10.133.0.44:8643: connect: connection refused" Apr 24 17:08:05.189821 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:05.189784 2561 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" podUID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 24 17:08:05.306641 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:05.306599 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 24 17:08:05.306810 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:05.306756 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" Apr 24 17:08:05.308076 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:05.308013 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" podUID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 24 17:08:06.191585 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.191555 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" podUID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 24 17:08:06.310100 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.310076 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" Apr 24 17:08:06.310726 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.310707 2561 generic.go:358] "Generic (PLEG): container finished" podID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerID="b8953e6ce84cea8fd0f8522647e83467ba2b4667f1ab7217d95fee1af4a13694" exitCode=0 Apr 24 17:08:06.310802 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.310778 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" event={"ID":"ded5d261-7d83-4e14-bcb0-56c408e51992","Type":"ContainerDied","Data":"b8953e6ce84cea8fd0f8522647e83467ba2b4667f1ab7217d95fee1af4a13694"} Apr 24 17:08:06.312045 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.312022 2561 generic.go:358] "Generic (PLEG): container finished" podID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerID="f0f997b2a62569b9b502cff352c3f05e99eccbb4c557c12543be5a6533eb0c0e" exitCode=0 Apr 24 17:08:06.312165 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.312090 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" Apr 24 17:08:06.312165 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.312101 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" event={"ID":"ae47a928-b7d6-4a05-9a47-2cf95df423ca","Type":"ContainerDied","Data":"f0f997b2a62569b9b502cff352c3f05e99eccbb4c557c12543be5a6533eb0c0e"} Apr 24 17:08:06.312165 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.312146 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x" event={"ID":"ae47a928-b7d6-4a05-9a47-2cf95df423ca","Type":"ContainerDied","Data":"bae4aa7f8c0234723c42b954c446306eef2154a0415ac56ce623830664cf31a0"} Apr 24 17:08:06.312165 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.312162 2561 scope.go:117] "RemoveContainer" containerID="6727c47cd0d69d8c0e967dffbe02f602b3280fef91e276ead6454bc54156fdcf" Apr 24 17:08:06.312559 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.312537 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" podUID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 24 17:08:06.319263 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.319248 2561 scope.go:117] "RemoveContainer" containerID="f0f997b2a62569b9b502cff352c3f05e99eccbb4c557c12543be5a6533eb0c0e" Apr 24 17:08:06.328967 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.328945 2561 scope.go:117] "RemoveContainer" containerID="6727c47cd0d69d8c0e967dffbe02f602b3280fef91e276ead6454bc54156fdcf" Apr 24 17:08:06.329259 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:08:06.329235 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6727c47cd0d69d8c0e967dffbe02f602b3280fef91e276ead6454bc54156fdcf\": container with ID starting with 6727c47cd0d69d8c0e967dffbe02f602b3280fef91e276ead6454bc54156fdcf not found: ID does not exist" containerID="6727c47cd0d69d8c0e967dffbe02f602b3280fef91e276ead6454bc54156fdcf" Apr 24 17:08:06.329325 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.329270 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6727c47cd0d69d8c0e967dffbe02f602b3280fef91e276ead6454bc54156fdcf"} err="failed to get container status \"6727c47cd0d69d8c0e967dffbe02f602b3280fef91e276ead6454bc54156fdcf\": rpc error: code = NotFound desc = could not find container \"6727c47cd0d69d8c0e967dffbe02f602b3280fef91e276ead6454bc54156fdcf\": container with ID starting with 6727c47cd0d69d8c0e967dffbe02f602b3280fef91e276ead6454bc54156fdcf not found: ID does not exist" Apr 24 17:08:06.329325 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.329291 2561 scope.go:117] "RemoveContainer" containerID="f0f997b2a62569b9b502cff352c3f05e99eccbb4c557c12543be5a6533eb0c0e" Apr 24 17:08:06.329554 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:08:06.329537 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0f997b2a62569b9b502cff352c3f05e99eccbb4c557c12543be5a6533eb0c0e\": container with ID starting with f0f997b2a62569b9b502cff352c3f05e99eccbb4c557c12543be5a6533eb0c0e not found: ID does not exist" containerID="f0f997b2a62569b9b502cff352c3f05e99eccbb4c557c12543be5a6533eb0c0e" Apr 24 17:08:06.329589 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.329561 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0f997b2a62569b9b502cff352c3f05e99eccbb4c557c12543be5a6533eb0c0e"} err="failed to get container status \"f0f997b2a62569b9b502cff352c3f05e99eccbb4c557c12543be5a6533eb0c0e\": rpc error: code = NotFound desc = could not find container 
\"f0f997b2a62569b9b502cff352c3f05e99eccbb4c557c12543be5a6533eb0c0e\": container with ID starting with f0f997b2a62569b9b502cff352c3f05e99eccbb4c557c12543be5a6533eb0c0e not found: ID does not exist" Apr 24 17:08:06.377193 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.377168 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5wfw\" (UniqueName: \"kubernetes.io/projected/ae47a928-b7d6-4a05-9a47-2cf95df423ca-kube-api-access-f5wfw\") pod \"ae47a928-b7d6-4a05-9a47-2cf95df423ca\" (UID: \"ae47a928-b7d6-4a05-9a47-2cf95df423ca\") " Apr 24 17:08:06.377315 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.377207 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-3827a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ae47a928-b7d6-4a05-9a47-2cf95df423ca-success-200-isvc-3827a-kube-rbac-proxy-sar-config\") pod \"ae47a928-b7d6-4a05-9a47-2cf95df423ca\" (UID: \"ae47a928-b7d6-4a05-9a47-2cf95df423ca\") " Apr 24 17:08:06.377315 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.377253 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae47a928-b7d6-4a05-9a47-2cf95df423ca-proxy-tls\") pod \"ae47a928-b7d6-4a05-9a47-2cf95df423ca\" (UID: \"ae47a928-b7d6-4a05-9a47-2cf95df423ca\") " Apr 24 17:08:06.377618 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.377586 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae47a928-b7d6-4a05-9a47-2cf95df423ca-success-200-isvc-3827a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-3827a-kube-rbac-proxy-sar-config") pod "ae47a928-b7d6-4a05-9a47-2cf95df423ca" (UID: "ae47a928-b7d6-4a05-9a47-2cf95df423ca"). InnerVolumeSpecName "success-200-isvc-3827a-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:08:06.379177 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.379154 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae47a928-b7d6-4a05-9a47-2cf95df423ca-kube-api-access-f5wfw" (OuterVolumeSpecName: "kube-api-access-f5wfw") pod "ae47a928-b7d6-4a05-9a47-2cf95df423ca" (UID: "ae47a928-b7d6-4a05-9a47-2cf95df423ca"). InnerVolumeSpecName "kube-api-access-f5wfw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:08:06.379289 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.379194 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae47a928-b7d6-4a05-9a47-2cf95df423ca-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ae47a928-b7d6-4a05-9a47-2cf95df423ca" (UID: "ae47a928-b7d6-4a05-9a47-2cf95df423ca"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:08:06.477886 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.477852 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae47a928-b7d6-4a05-9a47-2cf95df423ca-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:08:06.477886 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.477890 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f5wfw\" (UniqueName: \"kubernetes.io/projected/ae47a928-b7d6-4a05-9a47-2cf95df423ca-kube-api-access-f5wfw\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:08:06.478074 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.477906 2561 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-3827a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ae47a928-b7d6-4a05-9a47-2cf95df423ca-success-200-isvc-3827a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:08:06.585392 
ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.585370 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" Apr 24 17:08:06.633017 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.632989 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"] Apr 24 17:08:06.636469 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.636448 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3827a-predictor-7f98b8946b-25n6x"] Apr 24 17:08:06.679038 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.679013 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl4x4\" (UniqueName: \"kubernetes.io/projected/ded5d261-7d83-4e14-bcb0-56c408e51992-kube-api-access-tl4x4\") pod \"ded5d261-7d83-4e14-bcb0-56c408e51992\" (UID: \"ded5d261-7d83-4e14-bcb0-56c408e51992\") " Apr 24 17:08:06.679197 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.679082 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ded5d261-7d83-4e14-bcb0-56c408e51992-proxy-tls\") pod \"ded5d261-7d83-4e14-bcb0-56c408e51992\" (UID: \"ded5d261-7d83-4e14-bcb0-56c408e51992\") " Apr 24 17:08:06.679197 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.679109 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-3827a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ded5d261-7d83-4e14-bcb0-56c408e51992-error-404-isvc-3827a-kube-rbac-proxy-sar-config\") pod \"ded5d261-7d83-4e14-bcb0-56c408e51992\" (UID: \"ded5d261-7d83-4e14-bcb0-56c408e51992\") " Apr 24 17:08:06.679495 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.679468 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ded5d261-7d83-4e14-bcb0-56c408e51992-error-404-isvc-3827a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-3827a-kube-rbac-proxy-sar-config") pod "ded5d261-7d83-4e14-bcb0-56c408e51992" (UID: "ded5d261-7d83-4e14-bcb0-56c408e51992"). InnerVolumeSpecName "error-404-isvc-3827a-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:08:06.680954 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.680929 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded5d261-7d83-4e14-bcb0-56c408e51992-kube-api-access-tl4x4" (OuterVolumeSpecName: "kube-api-access-tl4x4") pod "ded5d261-7d83-4e14-bcb0-56c408e51992" (UID: "ded5d261-7d83-4e14-bcb0-56c408e51992"). InnerVolumeSpecName "kube-api-access-tl4x4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:08:06.681022 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.680929 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded5d261-7d83-4e14-bcb0-56c408e51992-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ded5d261-7d83-4e14-bcb0-56c408e51992" (UID: "ded5d261-7d83-4e14-bcb0-56c408e51992"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:08:06.780448 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.780360 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tl4x4\" (UniqueName: \"kubernetes.io/projected/ded5d261-7d83-4e14-bcb0-56c408e51992-kube-api-access-tl4x4\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:08:06.780448 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.780397 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ded5d261-7d83-4e14-bcb0-56c408e51992-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:08:06.780448 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:06.780416 2561 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-3827a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ded5d261-7d83-4e14-bcb0-56c408e51992-error-404-isvc-3827a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:08:07.317422 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:07.317397 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" Apr 24 17:08:07.317878 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:07.317395 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259" event={"ID":"ded5d261-7d83-4e14-bcb0-56c408e51992","Type":"ContainerDied","Data":"fe759162975bdcc041e9d77ce311ec154362f130fae7728f4d195ec4c0c5849a"} Apr 24 17:08:07.317878 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:07.317514 2561 scope.go:117] "RemoveContainer" containerID="896c9fc48e23aa1fb9ad9d9b7cbdc39435f546dccb885446191c88ff8e11f2fe" Apr 24 17:08:07.325316 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:07.325191 2561 scope.go:117] "RemoveContainer" containerID="b8953e6ce84cea8fd0f8522647e83467ba2b4667f1ab7217d95fee1af4a13694" Apr 24 17:08:07.338809 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:07.338781 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"] Apr 24 17:08:07.342617 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:07.342596 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3827a-predictor-69b76c5cb9-hq259"] Apr 24 17:08:07.652451 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:07.652378 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" path="/var/lib/kubelet/pods/ae47a928-b7d6-4a05-9a47-2cf95df423ca/volumes" Apr 24 17:08:07.652763 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:07.652751 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" path="/var/lib/kubelet/pods/ded5d261-7d83-4e14-bcb0-56c408e51992/volumes" Apr 24 17:08:10.310664 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:10.310639 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" Apr 24 17:08:10.311037 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:10.311015 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 24 17:08:11.317089 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:11.317057 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" Apr 24 17:08:11.317529 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:11.317505 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" podUID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 24 17:08:15.189260 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:15.189230 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:08:16.191896 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:16.191867 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:08:20.311182 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:20.311144 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 24 17:08:21.317585 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:21.317544 2561 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" podUID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 24 17:08:30.311653 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:30.311616 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 24 17:08:31.317845 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:31.317810 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" podUID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 24 17:08:40.311840 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:40.311804 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 24 17:08:41.317865 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:41.317830 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" podUID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 24 17:08:50.311855 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:50.311828 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" Apr 24 17:08:51.318926 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:08:51.318892 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" Apr 24 17:17:17.435869 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:17.435838 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"] Apr 24 17:17:17.436360 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:17.436104 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kserve-container" containerID="cri-o://32a464e72e4c9b4eb7d0d0bc4226b4809ec76733fa394d908295b83f335b9058" gracePeriod=30 Apr 24 17:17:17.436360 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:17.436148 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kube-rbac-proxy" containerID="cri-o://588faf0db07844d86a23a6eafb15d4892e3b4246278d46a7dd89a4e648619a44" gracePeriod=30 Apr 24 17:17:17.475671 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:17.475646 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"] Apr 24 17:17:17.475921 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:17.475898 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" podUID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerName="kserve-container" containerID="cri-o://ea5af9a8542e1555bd9e850b8ea8aa382b887e070f64010ef5d94d8f10ba44d6" gracePeriod=30 Apr 24 17:17:17.476001 ip-10-0-128-44 
kubenswrapper[2561]: I0424 17:17:17.475914 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" podUID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerName="kube-rbac-proxy" containerID="cri-o://50cc3c58405901429b978c36eb74beb44d914aff69aa44fcb8c4231594171f53" gracePeriod=30 Apr 24 17:17:18.021349 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:18.021312 2561 generic.go:358] "Generic (PLEG): container finished" podID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerID="50cc3c58405901429b978c36eb74beb44d914aff69aa44fcb8c4231594171f53" exitCode=2 Apr 24 17:17:18.021515 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:18.021374 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" event={"ID":"427c28cc-938e-40b6-b4c0-28d6ac14c441","Type":"ContainerDied","Data":"50cc3c58405901429b978c36eb74beb44d914aff69aa44fcb8c4231594171f53"} Apr 24 17:17:18.022766 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:18.022744 2561 generic.go:358] "Generic (PLEG): container finished" podID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerID="588faf0db07844d86a23a6eafb15d4892e3b4246278d46a7dd89a4e648619a44" exitCode=2 Apr 24 17:17:18.022878 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:18.022812 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" event={"ID":"f42d6fa9-8066-419f-8f4f-c795d797031b","Type":"ContainerDied","Data":"588faf0db07844d86a23a6eafb15d4892e3b4246278d46a7dd89a4e648619a44"} Apr 24 17:17:20.307384 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.307346 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.47:8643/healthz\": 
dial tcp 10.133.0.47:8643: connect: connection refused" Apr 24 17:17:20.311695 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.311673 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.47:8080: connect: connection refused" Apr 24 17:17:20.581896 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.581878 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" Apr 24 17:17:20.591046 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.591029 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" Apr 24 17:17:20.740284 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.740253 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfn4l\" (UniqueName: \"kubernetes.io/projected/f42d6fa9-8066-419f-8f4f-c795d797031b-kube-api-access-bfn4l\") pod \"f42d6fa9-8066-419f-8f4f-c795d797031b\" (UID: \"f42d6fa9-8066-419f-8f4f-c795d797031b\") " Apr 24 17:17:20.740459 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.740291 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dblnh\" (UniqueName: \"kubernetes.io/projected/427c28cc-938e-40b6-b4c0-28d6ac14c441-kube-api-access-dblnh\") pod \"427c28cc-938e-40b6-b4c0-28d6ac14c441\" (UID: \"427c28cc-938e-40b6-b4c0-28d6ac14c441\") " Apr 24 17:17:20.740459 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.740312 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-9c25c-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/f42d6fa9-8066-419f-8f4f-c795d797031b-success-200-isvc-9c25c-kube-rbac-proxy-sar-config\") pod \"f42d6fa9-8066-419f-8f4f-c795d797031b\" (UID: \"f42d6fa9-8066-419f-8f4f-c795d797031b\") " Apr 24 17:17:20.740459 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.740340 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-9c25c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/427c28cc-938e-40b6-b4c0-28d6ac14c441-error-404-isvc-9c25c-kube-rbac-proxy-sar-config\") pod \"427c28cc-938e-40b6-b4c0-28d6ac14c441\" (UID: \"427c28cc-938e-40b6-b4c0-28d6ac14c441\") " Apr 24 17:17:20.740459 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.740365 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f42d6fa9-8066-419f-8f4f-c795d797031b-proxy-tls\") pod \"f42d6fa9-8066-419f-8f4f-c795d797031b\" (UID: \"f42d6fa9-8066-419f-8f4f-c795d797031b\") " Apr 24 17:17:20.740459 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.740386 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/427c28cc-938e-40b6-b4c0-28d6ac14c441-proxy-tls\") pod \"427c28cc-938e-40b6-b4c0-28d6ac14c441\" (UID: \"427c28cc-938e-40b6-b4c0-28d6ac14c441\") " Apr 24 17:17:20.740718 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.740678 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f42d6fa9-8066-419f-8f4f-c795d797031b-success-200-isvc-9c25c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-9c25c-kube-rbac-proxy-sar-config") pod "f42d6fa9-8066-419f-8f4f-c795d797031b" (UID: "f42d6fa9-8066-419f-8f4f-c795d797031b"). InnerVolumeSpecName "success-200-isvc-9c25c-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:17:20.740774 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.740741 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/427c28cc-938e-40b6-b4c0-28d6ac14c441-error-404-isvc-9c25c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-9c25c-kube-rbac-proxy-sar-config") pod "427c28cc-938e-40b6-b4c0-28d6ac14c441" (UID: "427c28cc-938e-40b6-b4c0-28d6ac14c441"). InnerVolumeSpecName "error-404-isvc-9c25c-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:17:20.742440 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.742411 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f42d6fa9-8066-419f-8f4f-c795d797031b-kube-api-access-bfn4l" (OuterVolumeSpecName: "kube-api-access-bfn4l") pod "f42d6fa9-8066-419f-8f4f-c795d797031b" (UID: "f42d6fa9-8066-419f-8f4f-c795d797031b"). InnerVolumeSpecName "kube-api-access-bfn4l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:17:20.742440 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.742434 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f42d6fa9-8066-419f-8f4f-c795d797031b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f42d6fa9-8066-419f-8f4f-c795d797031b" (UID: "f42d6fa9-8066-419f-8f4f-c795d797031b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:17:20.742567 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.742457 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/427c28cc-938e-40b6-b4c0-28d6ac14c441-kube-api-access-dblnh" (OuterVolumeSpecName: "kube-api-access-dblnh") pod "427c28cc-938e-40b6-b4c0-28d6ac14c441" (UID: "427c28cc-938e-40b6-b4c0-28d6ac14c441"). InnerVolumeSpecName "kube-api-access-dblnh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:17:20.742567 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.742466 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427c28cc-938e-40b6-b4c0-28d6ac14c441-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "427c28cc-938e-40b6-b4c0-28d6ac14c441" (UID: "427c28cc-938e-40b6-b4c0-28d6ac14c441"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:17:20.841657 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.841606 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/427c28cc-938e-40b6-b4c0-28d6ac14c441-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:17:20.841657 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.841624 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bfn4l\" (UniqueName: \"kubernetes.io/projected/f42d6fa9-8066-419f-8f4f-c795d797031b-kube-api-access-bfn4l\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:17:20.841657 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.841634 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dblnh\" (UniqueName: \"kubernetes.io/projected/427c28cc-938e-40b6-b4c0-28d6ac14c441-kube-api-access-dblnh\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:17:20.841657 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.841643 2561 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-9c25c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f42d6fa9-8066-419f-8f4f-c795d797031b-success-200-isvc-9c25c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:17:20.841657 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.841653 2561 reconciler_common.go:299] "Volume detached for volume 
\"error-404-isvc-9c25c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/427c28cc-938e-40b6-b4c0-28d6ac14c441-error-404-isvc-9c25c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:17:20.841866 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:20.841662 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f42d6fa9-8066-419f-8f4f-c795d797031b-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:17:21.033925 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.033896 2561 generic.go:358] "Generic (PLEG): container finished" podID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerID="ea5af9a8542e1555bd9e850b8ea8aa382b887e070f64010ef5d94d8f10ba44d6" exitCode=0 Apr 24 17:17:21.034023 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.033969 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" Apr 24 17:17:21.034023 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.033985 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" event={"ID":"427c28cc-938e-40b6-b4c0-28d6ac14c441","Type":"ContainerDied","Data":"ea5af9a8542e1555bd9e850b8ea8aa382b887e070f64010ef5d94d8f10ba44d6"} Apr 24 17:17:21.034023 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.034016 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg" event={"ID":"427c28cc-938e-40b6-b4c0-28d6ac14c441","Type":"ContainerDied","Data":"2690b97336cd1fc32a184a986c8a3a138905f098e6339842dc2e8b512d5514cd"} Apr 24 17:17:21.034199 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.034030 2561 scope.go:117] "RemoveContainer" containerID="50cc3c58405901429b978c36eb74beb44d914aff69aa44fcb8c4231594171f53" Apr 24 17:17:21.035375 ip-10-0-128-44 
kubenswrapper[2561]: I0424 17:17:21.035349 2561 generic.go:358] "Generic (PLEG): container finished" podID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerID="32a464e72e4c9b4eb7d0d0bc4226b4809ec76733fa394d908295b83f335b9058" exitCode=0 Apr 24 17:17:21.035471 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.035391 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" event={"ID":"f42d6fa9-8066-419f-8f4f-c795d797031b","Type":"ContainerDied","Data":"32a464e72e4c9b4eb7d0d0bc4226b4809ec76733fa394d908295b83f335b9058"} Apr 24 17:17:21.035471 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.035423 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" event={"ID":"f42d6fa9-8066-419f-8f4f-c795d797031b","Type":"ContainerDied","Data":"ef4d2450624702ff069c39cf29454418162f9b45bba9d846ae5ad38426268009"} Apr 24 17:17:21.035471 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.035441 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj" Apr 24 17:17:21.041765 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.041747 2561 scope.go:117] "RemoveContainer" containerID="ea5af9a8542e1555bd9e850b8ea8aa382b887e070f64010ef5d94d8f10ba44d6" Apr 24 17:17:21.049812 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.049781 2561 scope.go:117] "RemoveContainer" containerID="50cc3c58405901429b978c36eb74beb44d914aff69aa44fcb8c4231594171f53" Apr 24 17:17:21.050140 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:17:21.050086 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50cc3c58405901429b978c36eb74beb44d914aff69aa44fcb8c4231594171f53\": container with ID starting with 50cc3c58405901429b978c36eb74beb44d914aff69aa44fcb8c4231594171f53 not found: ID does not exist" containerID="50cc3c58405901429b978c36eb74beb44d914aff69aa44fcb8c4231594171f53" Apr 24 17:17:21.050229 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.050161 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50cc3c58405901429b978c36eb74beb44d914aff69aa44fcb8c4231594171f53"} err="failed to get container status \"50cc3c58405901429b978c36eb74beb44d914aff69aa44fcb8c4231594171f53\": rpc error: code = NotFound desc = could not find container \"50cc3c58405901429b978c36eb74beb44d914aff69aa44fcb8c4231594171f53\": container with ID starting with 50cc3c58405901429b978c36eb74beb44d914aff69aa44fcb8c4231594171f53 not found: ID does not exist" Apr 24 17:17:21.050229 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.050205 2561 scope.go:117] "RemoveContainer" containerID="ea5af9a8542e1555bd9e850b8ea8aa382b887e070f64010ef5d94d8f10ba44d6" Apr 24 17:17:21.050654 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:17:21.050530 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ea5af9a8542e1555bd9e850b8ea8aa382b887e070f64010ef5d94d8f10ba44d6\": container with ID starting with ea5af9a8542e1555bd9e850b8ea8aa382b887e070f64010ef5d94d8f10ba44d6 not found: ID does not exist" containerID="ea5af9a8542e1555bd9e850b8ea8aa382b887e070f64010ef5d94d8f10ba44d6" Apr 24 17:17:21.050732 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.050673 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea5af9a8542e1555bd9e850b8ea8aa382b887e070f64010ef5d94d8f10ba44d6"} err="failed to get container status \"ea5af9a8542e1555bd9e850b8ea8aa382b887e070f64010ef5d94d8f10ba44d6\": rpc error: code = NotFound desc = could not find container \"ea5af9a8542e1555bd9e850b8ea8aa382b887e070f64010ef5d94d8f10ba44d6\": container with ID starting with ea5af9a8542e1555bd9e850b8ea8aa382b887e070f64010ef5d94d8f10ba44d6 not found: ID does not exist" Apr 24 17:17:21.050732 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.050697 2561 scope.go:117] "RemoveContainer" containerID="588faf0db07844d86a23a6eafb15d4892e3b4246278d46a7dd89a4e648619a44" Apr 24 17:17:21.059712 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.059694 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"] Apr 24 17:17:21.062026 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.062006 2561 scope.go:117] "RemoveContainer" containerID="32a464e72e4c9b4eb7d0d0bc4226b4809ec76733fa394d908295b83f335b9058" Apr 24 17:17:21.065710 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.065689 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9c25c-predictor-577bbdbf95-47mlg"] Apr 24 17:17:21.068938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.068922 2561 scope.go:117] "RemoveContainer" containerID="588faf0db07844d86a23a6eafb15d4892e3b4246278d46a7dd89a4e648619a44" Apr 24 17:17:21.069193 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:17:21.069170 2561 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"588faf0db07844d86a23a6eafb15d4892e3b4246278d46a7dd89a4e648619a44\": container with ID starting with 588faf0db07844d86a23a6eafb15d4892e3b4246278d46a7dd89a4e648619a44 not found: ID does not exist" containerID="588faf0db07844d86a23a6eafb15d4892e3b4246278d46a7dd89a4e648619a44" Apr 24 17:17:21.069237 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.069202 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"588faf0db07844d86a23a6eafb15d4892e3b4246278d46a7dd89a4e648619a44"} err="failed to get container status \"588faf0db07844d86a23a6eafb15d4892e3b4246278d46a7dd89a4e648619a44\": rpc error: code = NotFound desc = could not find container \"588faf0db07844d86a23a6eafb15d4892e3b4246278d46a7dd89a4e648619a44\": container with ID starting with 588faf0db07844d86a23a6eafb15d4892e3b4246278d46a7dd89a4e648619a44 not found: ID does not exist" Apr 24 17:17:21.069237 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.069224 2561 scope.go:117] "RemoveContainer" containerID="32a464e72e4c9b4eb7d0d0bc4226b4809ec76733fa394d908295b83f335b9058" Apr 24 17:17:21.069436 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:17:21.069419 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32a464e72e4c9b4eb7d0d0bc4226b4809ec76733fa394d908295b83f335b9058\": container with ID starting with 32a464e72e4c9b4eb7d0d0bc4226b4809ec76733fa394d908295b83f335b9058 not found: ID does not exist" containerID="32a464e72e4c9b4eb7d0d0bc4226b4809ec76733fa394d908295b83f335b9058" Apr 24 17:17:21.069478 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.069442 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32a464e72e4c9b4eb7d0d0bc4226b4809ec76733fa394d908295b83f335b9058"} err="failed to get container status 
\"32a464e72e4c9b4eb7d0d0bc4226b4809ec76733fa394d908295b83f335b9058\": rpc error: code = NotFound desc = could not find container \"32a464e72e4c9b4eb7d0d0bc4226b4809ec76733fa394d908295b83f335b9058\": container with ID starting with 32a464e72e4c9b4eb7d0d0bc4226b4809ec76733fa394d908295b83f335b9058 not found: ID does not exist" Apr 24 17:17:21.076329 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.076307 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"] Apr 24 17:17:21.077539 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.077521 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9c25c-predictor-69b79585f6-6nvqj"] Apr 24 17:17:21.651092 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.651025 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="427c28cc-938e-40b6-b4c0-28d6ac14c441" path="/var/lib/kubelet/pods/427c28cc-938e-40b6-b4c0-28d6ac14c441/volumes" Apr 24 17:17:21.651437 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:17:21.651423 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" path="/var/lib/kubelet/pods/f42d6fa9-8066-419f-8f4f-c795d797031b/volumes" Apr 24 17:24:47.373191 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:47.373154 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679"] Apr 24 17:24:47.373686 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:47.373526 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" podUID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerName="kserve-container" containerID="cri-o://a834a3a79f7332c36f97e0c22d2d4b765598945ce86f18658d4bcd619cd890fe" gracePeriod=30 Apr 24 17:24:47.373686 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:47.373556 2561 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" podUID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerName="kube-rbac-proxy" containerID="cri-o://934bcf2698d1f2a605e9931c53abd38cd5e8fcea92ac6d5a3f0891de25502374" gracePeriod=30 Apr 24 17:24:47.423371 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:47.423346 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj"] Apr 24 17:24:47.423631 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:47.423610 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" podUID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerName="kserve-container" containerID="cri-o://a47057cf2b3919c0aef2ffae02742b6ead2ebe4703801a952d55df6a57c9aa08" gracePeriod=30 Apr 24 17:24:47.423764 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:47.423715 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" podUID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerName="kube-rbac-proxy" containerID="cri-o://995117ead0b8e926cba1dbc25119385da76cccbd5a362e9d65d570461b6f2dca" gracePeriod=30 Apr 24 17:24:48.359480 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:48.359450 2561 generic.go:358] "Generic (PLEG): container finished" podID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerID="934bcf2698d1f2a605e9931c53abd38cd5e8fcea92ac6d5a3f0891de25502374" exitCode=2 Apr 24 17:24:48.359650 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:48.359519 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" event={"ID":"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0","Type":"ContainerDied","Data":"934bcf2698d1f2a605e9931c53abd38cd5e8fcea92ac6d5a3f0891de25502374"} Apr 24 
17:24:48.361040 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:48.361017 2561 generic.go:358] "Generic (PLEG): container finished" podID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerID="995117ead0b8e926cba1dbc25119385da76cccbd5a362e9d65d570461b6f2dca" exitCode=2 Apr 24 17:24:48.361161 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:48.361055 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" event={"ID":"81afd582-5ad6-4837-93c1-20edce9f24c9","Type":"ContainerDied","Data":"995117ead0b8e926cba1dbc25119385da76cccbd5a362e9d65d570461b6f2dca"} Apr 24 17:24:50.149057 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.149032 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:24:50.179111 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.179091 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:24:50.282525 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.282468 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81afd582-5ad6-4837-93c1-20edce9f24c9-proxy-tls\") pod \"81afd582-5ad6-4837-93c1-20edce9f24c9\" (UID: \"81afd582-5ad6-4837-93c1-20edce9f24c9\") " Apr 24 17:24:50.282525 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.282505 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ndmb\" (UniqueName: \"kubernetes.io/projected/81afd582-5ad6-4837-93c1-20edce9f24c9-kube-api-access-9ndmb\") pod \"81afd582-5ad6-4837-93c1-20edce9f24c9\" (UID: \"81afd582-5ad6-4837-93c1-20edce9f24c9\") " Apr 24 17:24:50.282525 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.282523 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-proxy-tls\") pod \"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0\" (UID: \"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0\") " Apr 24 17:24:50.282751 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.282541 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-8edb5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-success-200-isvc-8edb5-kube-rbac-proxy-sar-config\") pod \"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0\" (UID: \"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0\") " Apr 24 17:24:50.282751 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.282656 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-8edb5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/81afd582-5ad6-4837-93c1-20edce9f24c9-error-404-isvc-8edb5-kube-rbac-proxy-sar-config\") pod \"81afd582-5ad6-4837-93c1-20edce9f24c9\" (UID: \"81afd582-5ad6-4837-93c1-20edce9f24c9\") " Apr 24 17:24:50.282751 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.282739 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcf94\" (UniqueName: \"kubernetes.io/projected/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-kube-api-access-zcf94\") pod \"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0\" (UID: \"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0\") " Apr 24 17:24:50.282991 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.282867 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-success-200-isvc-8edb5-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-8edb5-kube-rbac-proxy-sar-config") pod "8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" (UID: "8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0"). InnerVolumeSpecName "success-200-isvc-8edb5-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:24:50.283106 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.283016 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81afd582-5ad6-4837-93c1-20edce9f24c9-error-404-isvc-8edb5-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-8edb5-kube-rbac-proxy-sar-config") pod "81afd582-5ad6-4837-93c1-20edce9f24c9" (UID: "81afd582-5ad6-4837-93c1-20edce9f24c9"). InnerVolumeSpecName "error-404-isvc-8edb5-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:24:50.284533 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.284511 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81afd582-5ad6-4837-93c1-20edce9f24c9-kube-api-access-9ndmb" (OuterVolumeSpecName: "kube-api-access-9ndmb") pod "81afd582-5ad6-4837-93c1-20edce9f24c9" (UID: "81afd582-5ad6-4837-93c1-20edce9f24c9"). InnerVolumeSpecName "kube-api-access-9ndmb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:24:50.284619 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.284597 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-kube-api-access-zcf94" (OuterVolumeSpecName: "kube-api-access-zcf94") pod "8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" (UID: "8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0"). InnerVolumeSpecName "kube-api-access-zcf94". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:24:50.284879 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.284863 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" (UID: "8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:24:50.284923 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.284878 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81afd582-5ad6-4837-93c1-20edce9f24c9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "81afd582-5ad6-4837-93c1-20edce9f24c9" (UID: "81afd582-5ad6-4837-93c1-20edce9f24c9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:24:50.369684 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.369656 2561 generic.go:358] "Generic (PLEG): container finished" podID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerID="a47057cf2b3919c0aef2ffae02742b6ead2ebe4703801a952d55df6a57c9aa08" exitCode=0 Apr 24 17:24:50.369792 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.369732 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" Apr 24 17:24:50.369792 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.369741 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" event={"ID":"81afd582-5ad6-4837-93c1-20edce9f24c9","Type":"ContainerDied","Data":"a47057cf2b3919c0aef2ffae02742b6ead2ebe4703801a952d55df6a57c9aa08"} Apr 24 17:24:50.369792 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.369775 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj" event={"ID":"81afd582-5ad6-4837-93c1-20edce9f24c9","Type":"ContainerDied","Data":"32e03ab0ffd0565b61d6c875d83677a6cfe09475b3a2b5c1a743bcfbefc3d51f"} Apr 24 17:24:50.369937 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.369796 2561 scope.go:117] "RemoveContainer" containerID="995117ead0b8e926cba1dbc25119385da76cccbd5a362e9d65d570461b6f2dca" Apr 24 17:24:50.371193 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.371157 
2561 generic.go:358] "Generic (PLEG): container finished" podID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerID="a834a3a79f7332c36f97e0c22d2d4b765598945ce86f18658d4bcd619cd890fe" exitCode=0 Apr 24 17:24:50.371330 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.371196 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" event={"ID":"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0","Type":"ContainerDied","Data":"a834a3a79f7332c36f97e0c22d2d4b765598945ce86f18658d4bcd619cd890fe"} Apr 24 17:24:50.371330 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.371217 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" Apr 24 17:24:50.371330 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.371222 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679" event={"ID":"8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0","Type":"ContainerDied","Data":"144395fc912e59d9511526b310beec18c8fd0cb8f17ce496ec9aa97165cd4c5a"} Apr 24 17:24:50.377652 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.377634 2561 scope.go:117] "RemoveContainer" containerID="a47057cf2b3919c0aef2ffae02742b6ead2ebe4703801a952d55df6a57c9aa08" Apr 24 17:24:50.383379 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.383361 2561 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-8edb5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-success-200-isvc-8edb5-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:24:50.383475 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.383384 2561 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-8edb5-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/81afd582-5ad6-4837-93c1-20edce9f24c9-error-404-isvc-8edb5-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:24:50.383475 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.383401 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zcf94\" (UniqueName: \"kubernetes.io/projected/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-kube-api-access-zcf94\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:24:50.383475 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.383415 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81afd582-5ad6-4837-93c1-20edce9f24c9-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:24:50.383475 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.383429 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9ndmb\" (UniqueName: \"kubernetes.io/projected/81afd582-5ad6-4837-93c1-20edce9f24c9-kube-api-access-9ndmb\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:24:50.383475 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.383441 2561 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0-proxy-tls\") on node \"ip-10-0-128-44.ec2.internal\" DevicePath \"\"" Apr 24 17:24:50.384920 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.384903 2561 scope.go:117] "RemoveContainer" containerID="995117ead0b8e926cba1dbc25119385da76cccbd5a362e9d65d570461b6f2dca" Apr 24 17:24:50.385174 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:24:50.385157 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995117ead0b8e926cba1dbc25119385da76cccbd5a362e9d65d570461b6f2dca\": container with ID starting with 995117ead0b8e926cba1dbc25119385da76cccbd5a362e9d65d570461b6f2dca not found: ID 
does not exist" containerID="995117ead0b8e926cba1dbc25119385da76cccbd5a362e9d65d570461b6f2dca" Apr 24 17:24:50.385223 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.385181 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"995117ead0b8e926cba1dbc25119385da76cccbd5a362e9d65d570461b6f2dca"} err="failed to get container status \"995117ead0b8e926cba1dbc25119385da76cccbd5a362e9d65d570461b6f2dca\": rpc error: code = NotFound desc = could not find container \"995117ead0b8e926cba1dbc25119385da76cccbd5a362e9d65d570461b6f2dca\": container with ID starting with 995117ead0b8e926cba1dbc25119385da76cccbd5a362e9d65d570461b6f2dca not found: ID does not exist" Apr 24 17:24:50.385223 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.385197 2561 scope.go:117] "RemoveContainer" containerID="a47057cf2b3919c0aef2ffae02742b6ead2ebe4703801a952d55df6a57c9aa08" Apr 24 17:24:50.385425 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:24:50.385408 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a47057cf2b3919c0aef2ffae02742b6ead2ebe4703801a952d55df6a57c9aa08\": container with ID starting with a47057cf2b3919c0aef2ffae02742b6ead2ebe4703801a952d55df6a57c9aa08 not found: ID does not exist" containerID="a47057cf2b3919c0aef2ffae02742b6ead2ebe4703801a952d55df6a57c9aa08" Apr 24 17:24:50.385460 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.385431 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a47057cf2b3919c0aef2ffae02742b6ead2ebe4703801a952d55df6a57c9aa08"} err="failed to get container status \"a47057cf2b3919c0aef2ffae02742b6ead2ebe4703801a952d55df6a57c9aa08\": rpc error: code = NotFound desc = could not find container \"a47057cf2b3919c0aef2ffae02742b6ead2ebe4703801a952d55df6a57c9aa08\": container with ID starting with a47057cf2b3919c0aef2ffae02742b6ead2ebe4703801a952d55df6a57c9aa08 not found: ID does not exist" 
Apr 24 17:24:50.385460 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.385446 2561 scope.go:117] "RemoveContainer" containerID="934bcf2698d1f2a605e9931c53abd38cd5e8fcea92ac6d5a3f0891de25502374" Apr 24 17:24:50.392366 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.392345 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj"] Apr 24 17:24:50.392670 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.392653 2561 scope.go:117] "RemoveContainer" containerID="a834a3a79f7332c36f97e0c22d2d4b765598945ce86f18658d4bcd619cd890fe" Apr 24 17:24:50.397916 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.397890 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8edb5-predictor-667c7d688c-fwxdj"] Apr 24 17:24:50.403320 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.403304 2561 scope.go:117] "RemoveContainer" containerID="934bcf2698d1f2a605e9931c53abd38cd5e8fcea92ac6d5a3f0891de25502374" Apr 24 17:24:50.403575 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:24:50.403558 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"934bcf2698d1f2a605e9931c53abd38cd5e8fcea92ac6d5a3f0891de25502374\": container with ID starting with 934bcf2698d1f2a605e9931c53abd38cd5e8fcea92ac6d5a3f0891de25502374 not found: ID does not exist" containerID="934bcf2698d1f2a605e9931c53abd38cd5e8fcea92ac6d5a3f0891de25502374" Apr 24 17:24:50.403625 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.403582 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"934bcf2698d1f2a605e9931c53abd38cd5e8fcea92ac6d5a3f0891de25502374"} err="failed to get container status \"934bcf2698d1f2a605e9931c53abd38cd5e8fcea92ac6d5a3f0891de25502374\": rpc error: code = NotFound desc = could not find container \"934bcf2698d1f2a605e9931c53abd38cd5e8fcea92ac6d5a3f0891de25502374\": container with ID 
starting with 934bcf2698d1f2a605e9931c53abd38cd5e8fcea92ac6d5a3f0891de25502374 not found: ID does not exist" Apr 24 17:24:50.403625 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.403597 2561 scope.go:117] "RemoveContainer" containerID="a834a3a79f7332c36f97e0c22d2d4b765598945ce86f18658d4bcd619cd890fe" Apr 24 17:24:50.403824 ip-10-0-128-44 kubenswrapper[2561]: E0424 17:24:50.403807 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a834a3a79f7332c36f97e0c22d2d4b765598945ce86f18658d4bcd619cd890fe\": container with ID starting with a834a3a79f7332c36f97e0c22d2d4b765598945ce86f18658d4bcd619cd890fe not found: ID does not exist" containerID="a834a3a79f7332c36f97e0c22d2d4b765598945ce86f18658d4bcd619cd890fe" Apr 24 17:24:50.403871 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.403829 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a834a3a79f7332c36f97e0c22d2d4b765598945ce86f18658d4bcd619cd890fe"} err="failed to get container status \"a834a3a79f7332c36f97e0c22d2d4b765598945ce86f18658d4bcd619cd890fe\": rpc error: code = NotFound desc = could not find container \"a834a3a79f7332c36f97e0c22d2d4b765598945ce86f18658d4bcd619cd890fe\": container with ID starting with a834a3a79f7332c36f97e0c22d2d4b765598945ce86f18658d4bcd619cd890fe not found: ID does not exist" Apr 24 17:24:50.408918 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.408898 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679"] Apr 24 17:24:50.414030 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:50.414011 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8edb5-predictor-7666dfcc56-g6679"] Apr 24 17:24:51.651568 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:51.651533 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="81afd582-5ad6-4837-93c1-20edce9f24c9" path="/var/lib/kubelet/pods/81afd582-5ad6-4837-93c1-20edce9f24c9/volumes" Apr 24 17:24:51.651935 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:24:51.651923 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" path="/var/lib/kubelet/pods/8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0/volumes" Apr 24 17:25:17.917780 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:17.917701 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-whm4n_608b6244-9d8e-4eea-86b2-64a5cf06e100/global-pull-secret-syncer/0.log" Apr 24 17:25:17.972642 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:17.972606 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9bw4h_5360254e-a924-4ddf-8043-9c43f8152fe3/konnectivity-agent/0.log" Apr 24 17:25:18.066226 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:18.066205 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-44.ec2.internal_7c14f0193c74d30d8fc8bec912702c00/haproxy/0.log" Apr 24 17:25:21.801027 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:21.800995 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-94xpm_cde6ad00-395f-4d91-befe-1d950c0d7182/monitoring-plugin/0.log" Apr 24 17:25:21.833505 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:21.833480 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4r2vk_0537ffed-f4ae-49a0-a7ef-a27e012a41ac/node-exporter/0.log" Apr 24 17:25:21.855278 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:21.855248 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4r2vk_0537ffed-f4ae-49a0-a7ef-a27e012a41ac/kube-rbac-proxy/0.log" Apr 24 17:25:21.876210 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:21.876191 2561 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4r2vk_0537ffed-f4ae-49a0-a7ef-a27e012a41ac/init-textfile/0.log" Apr 24 17:25:23.731954 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:23.731923 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-mr4bh_e7b94c4f-1b89-4a52-ba90-78436120bab7/networking-console-plugin/0.log" Apr 24 17:25:24.607544 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:24.607514 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-pq8gb_9be54bef-8146-4560-aef7-387e885e8227/download-server/0.log" Apr 24 17:25:25.273086 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273056 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm"] Apr 24 17:25:25.273457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273368 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerName="kube-rbac-proxy" Apr 24 17:25:25.273457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273380 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerName="kube-rbac-proxy" Apr 24 17:25:25.273457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273389 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerName="kserve-container" Apr 24 17:25:25.273457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273394 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerName="kserve-container" Apr 24 17:25:25.273457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273401 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" 
containerName="kserve-container" Apr 24 17:25:25.273457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273406 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerName="kserve-container" Apr 24 17:25:25.273457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273413 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerName="kserve-container" Apr 24 17:25:25.273457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273418 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerName="kserve-container" Apr 24 17:25:25.273457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273427 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerName="kube-rbac-proxy" Apr 24 17:25:25.273457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273432 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerName="kube-rbac-proxy" Apr 24 17:25:25.273457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273442 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerName="kube-rbac-proxy" Apr 24 17:25:25.273457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273447 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerName="kube-rbac-proxy" Apr 24 17:25:25.273457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273454 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerName="kserve-container" Apr 24 17:25:25.273457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273459 2561 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerName="kserve-container" Apr 24 17:25:25.273457 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273464 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerName="kserve-container" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273469 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerName="kserve-container" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273479 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerName="kube-rbac-proxy" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273484 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerName="kube-rbac-proxy" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273488 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kube-rbac-proxy" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273493 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kube-rbac-proxy" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273499 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerName="kube-rbac-proxy" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273506 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerName="kube-rbac-proxy" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273511 2561 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kserve-container" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273515 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kserve-container" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273560 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerName="kube-rbac-proxy" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273568 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kube-rbac-proxy" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273574 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerName="kube-rbac-proxy" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273580 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerName="kserve-container" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273586 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="f42d6fa9-8066-419f-8f4f-c795d797031b" containerName="kserve-container" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273591 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="427c28cc-938e-40b6-b4c0-28d6ac14c441" containerName="kserve-container" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273597 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae47a928-b7d6-4a05-9a47-2cf95df423ca" containerName="kube-rbac-proxy" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273602 2561 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerName="kube-rbac-proxy" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273608 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerName="kube-rbac-proxy" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273613 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b25dd1a-05c9-4bc1-9002-cc05dbe16ab0" containerName="kserve-container" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273618 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="81afd582-5ad6-4837-93c1-20edce9f24c9" containerName="kserve-container" Apr 24 17:25:25.273938 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.273625 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="ded5d261-7d83-4e14-bcb0-56c408e51992" containerName="kserve-container" Apr 24 17:25:25.276567 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.276553 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.278982 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.278961 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-f77jl\"/\"default-dockercfg-fbxhs\"" Apr 24 17:25:25.279102 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.279082 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f77jl\"/\"openshift-service-ca.crt\"" Apr 24 17:25:25.280042 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.280027 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f77jl\"/\"kube-root-ca.crt\"" Apr 24 17:25:25.285393 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.285374 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm"] Apr 24 17:25:25.417729 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.417706 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e03668ee-c651-46d6-9005-d3bcf182fea9-podres\") pod \"perf-node-gather-daemonset-gjhnm\" (UID: \"e03668ee-c651-46d6-9005-d3bcf182fea9\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.417729 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.417737 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e03668ee-c651-46d6-9005-d3bcf182fea9-proc\") pod \"perf-node-gather-daemonset-gjhnm\" (UID: \"e03668ee-c651-46d6-9005-d3bcf182fea9\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.417894 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.417756 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e03668ee-c651-46d6-9005-d3bcf182fea9-sys\") pod \"perf-node-gather-daemonset-gjhnm\" (UID: \"e03668ee-c651-46d6-9005-d3bcf182fea9\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.417894 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.417832 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e03668ee-c651-46d6-9005-d3bcf182fea9-lib-modules\") pod \"perf-node-gather-daemonset-gjhnm\" (UID: \"e03668ee-c651-46d6-9005-d3bcf182fea9\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.417894 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.417884 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bdhp\" (UniqueName: \"kubernetes.io/projected/e03668ee-c651-46d6-9005-d3bcf182fea9-kube-api-access-5bdhp\") pod \"perf-node-gather-daemonset-gjhnm\" (UID: \"e03668ee-c651-46d6-9005-d3bcf182fea9\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.518359 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.518335 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bdhp\" (UniqueName: \"kubernetes.io/projected/e03668ee-c651-46d6-9005-d3bcf182fea9-kube-api-access-5bdhp\") pod \"perf-node-gather-daemonset-gjhnm\" (UID: \"e03668ee-c651-46d6-9005-d3bcf182fea9\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.518483 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.518374 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e03668ee-c651-46d6-9005-d3bcf182fea9-podres\") pod \"perf-node-gather-daemonset-gjhnm\" (UID: \"e03668ee-c651-46d6-9005-d3bcf182fea9\") " 
pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.518483 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.518398 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e03668ee-c651-46d6-9005-d3bcf182fea9-proc\") pod \"perf-node-gather-daemonset-gjhnm\" (UID: \"e03668ee-c651-46d6-9005-d3bcf182fea9\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.518483 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.518421 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e03668ee-c651-46d6-9005-d3bcf182fea9-sys\") pod \"perf-node-gather-daemonset-gjhnm\" (UID: \"e03668ee-c651-46d6-9005-d3bcf182fea9\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.518483 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.518464 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e03668ee-c651-46d6-9005-d3bcf182fea9-lib-modules\") pod \"perf-node-gather-daemonset-gjhnm\" (UID: \"e03668ee-c651-46d6-9005-d3bcf182fea9\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.518642 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.518481 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e03668ee-c651-46d6-9005-d3bcf182fea9-proc\") pod \"perf-node-gather-daemonset-gjhnm\" (UID: \"e03668ee-c651-46d6-9005-d3bcf182fea9\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.518642 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.518519 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e03668ee-c651-46d6-9005-d3bcf182fea9-podres\") pod 
\"perf-node-gather-daemonset-gjhnm\" (UID: \"e03668ee-c651-46d6-9005-d3bcf182fea9\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.518642 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.518533 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e03668ee-c651-46d6-9005-d3bcf182fea9-sys\") pod \"perf-node-gather-daemonset-gjhnm\" (UID: \"e03668ee-c651-46d6-9005-d3bcf182fea9\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.518642 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.518588 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e03668ee-c651-46d6-9005-d3bcf182fea9-lib-modules\") pod \"perf-node-gather-daemonset-gjhnm\" (UID: \"e03668ee-c651-46d6-9005-d3bcf182fea9\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.526190 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.526137 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bdhp\" (UniqueName: \"kubernetes.io/projected/e03668ee-c651-46d6-9005-d3bcf182fea9-kube-api-access-5bdhp\") pod \"perf-node-gather-daemonset-gjhnm\" (UID: \"e03668ee-c651-46d6-9005-d3bcf182fea9\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" Apr 24 17:25:25.586303 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.586273 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm"
Apr 24 17:25:25.698007 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.697898 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm"]
Apr 24 17:25:25.700605 ip-10-0-128-44 kubenswrapper[2561]: W0424 17:25:25.700575 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode03668ee_c651_46d6_9005_d3bcf182fea9.slice/crio-982f16024924f4a8f2b72da7a477449b57ec3dbc88968e28819af1a76648f5ef WatchSource:0}: Error finding container 982f16024924f4a8f2b72da7a477449b57ec3dbc88968e28819af1a76648f5ef: Status 404 returned error can't find the container with id 982f16024924f4a8f2b72da7a477449b57ec3dbc88968e28819af1a76648f5ef
Apr 24 17:25:25.702161 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.702146 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 17:25:25.752054 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.752035 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ncrkw_d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b/dns/0.log"
Apr 24 17:25:25.773633 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.773616 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ncrkw_d8f68bdd-e4ea-40ca-a7ac-12f3eeeebd4b/kube-rbac-proxy/0.log"
Apr 24 17:25:25.912485 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:25.912424 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-l8hzw_e5c88c86-c895-48fe-8dd8-898499027e1f/dns-node-resolver/0.log"
Apr 24 17:25:26.354603 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:26.354574 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6c787d785f-2vkbk_d56aa02a-1a5d-4ac9-b72b-f121ecc374b3/registry/0.log"
Apr 24 17:25:26.444739 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:26.444707 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wcmwg_5461250a-930b-44a6-ba77-773586840e32/node-ca/0.log"
Apr 24 17:25:26.479394 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:26.479362 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" event={"ID":"e03668ee-c651-46d6-9005-d3bcf182fea9","Type":"ContainerStarted","Data":"4d91c0fed6f70deca025c77ae357a54343d6aa27a2bcd57c160e82e80853da53"}
Apr 24 17:25:26.479394 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:26.479391 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" event={"ID":"e03668ee-c651-46d6-9005-d3bcf182fea9","Type":"ContainerStarted","Data":"982f16024924f4a8f2b72da7a477449b57ec3dbc88968e28819af1a76648f5ef"}
Apr 24 17:25:26.479544 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:26.479413 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm"
Apr 24 17:25:26.498755 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:26.498716 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm" podStartSLOduration=1.4987041589999999 podStartE2EDuration="1.498704159s" podCreationTimestamp="2026-04-24 17:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:25:26.496710617 +0000 UTC m=+2795.349581330" watchObservedRunningTime="2026-04-24 17:25:26.498704159 +0000 UTC m=+2795.351574871"
Apr 24 17:25:27.579989 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:27.579948 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-hs8hk_d1d5372f-dc95-47d2-a6be-7d40b52c4c1d/serve-healthcheck-canary/0.log"
Apr 24 17:25:28.095039 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:28.095012 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l2q9c_adcfe200-f21b-4233-a47a-d3e925e32b96/kube-rbac-proxy/0.log"
Apr 24 17:25:28.117226 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:28.117203 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l2q9c_adcfe200-f21b-4233-a47a-d3e925e32b96/exporter/0.log"
Apr 24 17:25:28.139213 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:28.139191 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l2q9c_adcfe200-f21b-4233-a47a-d3e925e32b96/extractor/0.log"
Apr 24 17:25:30.089247 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:30.089218 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7f7fb4c66f-gczs8_d1cf6eeb-9ea9-470c-852e-c4f262ddbca3/manager/0.log"
Apr 24 17:25:30.132968 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:30.132931 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-879lh_45d6ccd2-4d56-43df-b37c-538044e628e0/server/0.log"
Apr 24 17:25:30.555251 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:30.555219 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-vfd2q_6617c0d5-1e92-46e7-ba6f-65a16498f29b/manager/0.log"
Apr 24 17:25:30.603853 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:30.603822 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-wxq7p_8f7987de-e1ca-4187-9ea5-6347c18b87b5/seaweedfs/0.log"
Apr 24 17:25:32.491958 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:32.491936 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-gjhnm"
Apr 24 17:25:34.427274 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:34.427188 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-6gd9f_e7643e3d-a60c-4b47-af12-ca48b6e20569/migrator/0.log"
Apr 24 17:25:34.448593 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:34.448569 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-6gd9f_e7643e3d-a60c-4b47-af12-ca48b6e20569/graceful-termination/0.log"
Apr 24 17:25:35.990969 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:35.990936 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5qxgh_3c59ab7c-c1d6-4ca5-b111-ef7591ea2ea2/kube-multus/0.log"
Apr 24 17:25:36.382582 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:36.382558 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k7nht_0cf50135-6483-41fd-a681-60b2bfc1a66d/kube-multus-additional-cni-plugins/0.log"
Apr 24 17:25:36.408129 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:36.408091 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k7nht_0cf50135-6483-41fd-a681-60b2bfc1a66d/egress-router-binary-copy/0.log"
Apr 24 17:25:36.430799 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:36.430773 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k7nht_0cf50135-6483-41fd-a681-60b2bfc1a66d/cni-plugins/0.log"
Apr 24 17:25:36.452054 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:36.452035 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k7nht_0cf50135-6483-41fd-a681-60b2bfc1a66d/bond-cni-plugin/0.log"
Apr 24 17:25:36.474268 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:36.474244 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k7nht_0cf50135-6483-41fd-a681-60b2bfc1a66d/routeoverride-cni/0.log"
Apr 24 17:25:36.497507 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:36.497483 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k7nht_0cf50135-6483-41fd-a681-60b2bfc1a66d/whereabouts-cni-bincopy/0.log"
Apr 24 17:25:36.517250 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:36.517228 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k7nht_0cf50135-6483-41fd-a681-60b2bfc1a66d/whereabouts-cni/0.log"
Apr 24 17:25:36.568612 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:36.568548 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6c75x_7abff685-57e5-4da6-b220-dbc6c56835ed/network-metrics-daemon/0.log"
Apr 24 17:25:36.585988 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:36.585947 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6c75x_7abff685-57e5-4da6-b220-dbc6c56835ed/kube-rbac-proxy/0.log"
Apr 24 17:25:38.012778 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:38.012748 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rrwzc_57df94e3-357d-4303-9788-f5e7a9d03ae9/ovn-controller/0.log"
Apr 24 17:25:38.059477 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:38.059453 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rrwzc_57df94e3-357d-4303-9788-f5e7a9d03ae9/ovn-acl-logging/0.log"
Apr 24 17:25:38.083140 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:38.083102 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rrwzc_57df94e3-357d-4303-9788-f5e7a9d03ae9/kube-rbac-proxy-node/0.log"
Apr 24 17:25:38.105676 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:38.105652 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rrwzc_57df94e3-357d-4303-9788-f5e7a9d03ae9/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 17:25:38.124430 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:38.124407 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rrwzc_57df94e3-357d-4303-9788-f5e7a9d03ae9/northd/0.log"
Apr 24 17:25:38.146942 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:38.146923 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rrwzc_57df94e3-357d-4303-9788-f5e7a9d03ae9/nbdb/0.log"
Apr 24 17:25:38.170834 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:38.170807 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rrwzc_57df94e3-357d-4303-9788-f5e7a9d03ae9/sbdb/0.log"
Apr 24 17:25:38.336439 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:38.336377 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rrwzc_57df94e3-357d-4303-9788-f5e7a9d03ae9/ovnkube-controller/0.log"
Apr 24 17:25:39.358443 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:39.358415 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-2p9kk_f32abd88-b479-49d4-b013-a952386d695c/network-check-target-container/0.log"
Apr 24 17:25:40.302279 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:40.302251 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-5bm47_df14d60b-71e6-490b-a1cb-a94cefccd438/iptables-alerter/0.log"
Apr 24 17:25:40.971470 ip-10-0-128-44 kubenswrapper[2561]: I0424 17:25:40.971442 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-hkhzt_c71dad60-6f43-425f-9755-72c272d5116b/tuned/0.log"