Apr 28 19:13:21.496892 ip-10-0-134-36 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 28 19:13:21.496906 ip-10-0-134-36 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 28 19:13:21.496916 ip-10-0-134-36 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 28 19:13:21.497227 ip-10-0-134-36 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 28 19:13:31.530089 ip-10-0-134-36 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 28 19:13:31.530102 ip-10-0-134-36 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 87d3833719984b689d20d1ca08c9f2f7 --
Apr 28 19:15:58.207454 ip-10-0-134-36 systemd[1]: Starting Kubernetes Kubelet...
Apr 28 19:15:58.652920 ip-10-0-134-36 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:15:58.652920 ip-10-0-134-36 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 28 19:15:58.652920 ip-10-0-134-36 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:15:58.652920 ip-10-0-134-36 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 28 19:15:58.652920 ip-10-0-134-36 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:15:58.655845 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.655758    2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 28 19:15:58.657900 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657885    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:15:58.657935 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657901    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:15:58.657935 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657905    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:58.657935 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657908    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:15:58.657935 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657912    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:15:58.657935 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657915    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:15:58.657935 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657917    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:15:58.657935 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657920    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:58.657935 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657923    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:15:58.657935 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657926    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:15:58.657935 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657930    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:58.657935 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657934    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:15:58.657935 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657936    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:15:58.657935 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657939    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:15:58.657935 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657942    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657945    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657947    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657950    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657953    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657955    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657958    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657960    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657962    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657965    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657967    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657970    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657972    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657975    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657977    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657980    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657982    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657985    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657987    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657990    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:58.658318 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657992    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657994    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657997    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.657999    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658002    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658005    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658008    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658010    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658013    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658016    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658018    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658020    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658023    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658025    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658028    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658031    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658034    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658036    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658038    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658041    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:15:58.658786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658044    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658046    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658049    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658051    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658053    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658056    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658058    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658061    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658063    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658066    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658068    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658070    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658073    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658075    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658078    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658080    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658082    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658085    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658087    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658089    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:15:58.659282 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658092    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658095    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658097    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658102    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658113    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658116    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658120    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658124    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658127    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658129    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658132    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658134    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658493    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658498    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658501    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658504    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658506    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658509    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658511    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:58.659752 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658514    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658516    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658519    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658521    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658524    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658527    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658529    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658532    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658534    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658537    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658539    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658542    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658544    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658547    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658550    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658552    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658555    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658557    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658560    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658563    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:15:58.660208 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658567    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658570    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658573    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658576    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658578    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658581    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658584    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658587    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658590    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658592    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658595    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658598    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658600    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658603    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658606    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658609    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658611    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658614    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658616    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658619    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:15:58.660689 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658621    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658623    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658626    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658628    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658631    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658635    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658638    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658641    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658644    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658646    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658649    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658652    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658655    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658657    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658660    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658662    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658664    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658667    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658669    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658672    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:15:58.661163 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658675    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658677    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658679    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658682    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658684    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658687    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658689    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658692    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658694    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658696    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658699    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658701    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658704    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658706    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658709    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658712    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658714    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658717    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.658719    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.659978    2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 28 19:15:58.661678 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.659994    2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660000    2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660004    2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660009    2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660012    2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660017    2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660021    2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660024    2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660027    2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660030    2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660033    2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660036    2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660039    2572 flags.go:64] FLAG: --cgroup-root=""
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660042    2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660045    2572 flags.go:64] FLAG: --client-ca-file=""
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660048    2572 flags.go:64] FLAG: --cloud-config=""
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660050    2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660053    2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660056    2572 flags.go:64] FLAG: --cluster-domain=""
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660059    2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660062    2572 flags.go:64] FLAG: --config-dir=""
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660065    2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660068    2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660072    2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 28 19:15:58.662183 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660075    2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660078    2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660081    2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660084    2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660087    2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660090    2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660094    2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660097    2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660101    2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660105    2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660108    2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660111    2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660114    2572 flags.go:64] FLAG: --enable-server="true"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660117    2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660121    2572 flags.go:64] FLAG: --event-burst="100"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660124    2572 flags.go:64] FLAG: --event-qps="50"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660127    2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660130    2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660133    2572 flags.go:64] FLAG: --eviction-hard=""
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660136    2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660139    2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660142 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660145 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660148 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660151 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 28 19:15:58.662749 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660154 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660157 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660160 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660162 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660165 2572 flags.go:64] FLAG: --feature-gates=""
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660180 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660184 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660187 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660190 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660193 2572 flags.go:64] FLAG: --healthz-port="10248"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660196 2572 flags.go:64] FLAG: --help="false"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660199 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-134-36.ec2.internal"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660202 2572 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660205 2572 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660208 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660211 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660215 2572 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660219 2572 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660222 2572 flags.go:64] FLAG: --image-service-endpoint=""
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660225 2572 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660228 2572 flags.go:64] FLAG: --kube-api-burst="100"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660231 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660234 2572 flags.go:64] FLAG: --kube-api-qps="50"
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660237 2572 flags.go:64] FLAG: --kube-reserved=""
Apr 28 19:15:58.663358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660240 2572 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660243 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660246 2572 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660248 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660251 2572 flags.go:64] FLAG: --lock-file=""
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660254 2572 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660257 2572 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660260 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660264 2572 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660267 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660270 2572 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660272 2572 flags.go:64] FLAG: --logging-format="text"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660276 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660279 2572 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660281 2572 flags.go:64] FLAG: --manifest-url=""
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660284 2572 flags.go:64] FLAG: --manifest-url-header=""
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660288 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660291 2572 flags.go:64] FLAG: --max-open-files="1000000"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660295 2572 flags.go:64] FLAG: --max-pods="110"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660298 2572 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660301 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660304 2572 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660307 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660310 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660313 2572 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 28 19:15:58.663944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660316 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660324 2572 flags.go:64] FLAG: --node-status-max-images="50"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660327 2572 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660330 2572 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660333 2572 flags.go:64] FLAG: --pod-cidr=""
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660336 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660341 2572 flags.go:64] FLAG: --pod-manifest-path=""
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660344 2572 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660347 2572 flags.go:64] FLAG: --pods-per-core="0"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660350 2572 flags.go:64] FLAG: --port="10250"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660353 2572 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660356 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06cd7b4d9dee7a845"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660359 2572 flags.go:64] FLAG: --qos-reserved=""
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660362 2572 flags.go:64] FLAG: --read-only-port="10255"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660364 2572 flags.go:64] FLAG: --register-node="true"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660367 2572 flags.go:64] FLAG: --register-schedulable="true"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660370 2572 flags.go:64] FLAG: --register-with-taints=""
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660374 2572 flags.go:64] FLAG: --registry-burst="10"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660376 2572 flags.go:64] FLAG: --registry-qps="5"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660379 2572 flags.go:64] FLAG: --reserved-cpus=""
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660382 2572 flags.go:64] FLAG: --reserved-memory=""
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660385 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660388 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660391 2572 flags.go:64] FLAG: --rotate-certificates="false"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660394 2572 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 28 19:15:58.664553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660396 2572 flags.go:64] FLAG: --runonce="false"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660399 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660402 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660405 2572 flags.go:64] FLAG: --seccomp-default="false"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660408 2572 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660411 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660414 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660417 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660420 2572 flags.go:64] FLAG: --storage-driver-password="root"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660424 2572 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660427 2572 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660429 2572 flags.go:64] FLAG: --storage-driver-user="root"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660432 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660435 2572 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660438 2572 flags.go:64] FLAG: --system-cgroups=""
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660441 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660446 2572 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660448 2572 flags.go:64] FLAG: --tls-cert-file=""
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660451 2572 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660455 2572 flags.go:64] FLAG: --tls-min-version=""
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660457 2572 flags.go:64] FLAG: --tls-private-key-file=""
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660460 2572 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660463 2572 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660465 2572 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660468 2572 flags.go:64] FLAG: --v="2"
Apr 28 19:15:58.665198 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660473 2572 flags.go:64] FLAG: --version="false"
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660477 2572 flags.go:64] FLAG: --vmodule=""
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660480 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.660483 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660570 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660574 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660577 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660579 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660582 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660584 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660587 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660589 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660592 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660594 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660596 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660599 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660601 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660604 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660607 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660610 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:15:58.665786 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660612 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660615 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660618 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660622 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660624 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660627 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660629 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660631 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660634 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660636 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660639 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660642 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660644 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660647 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660649 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660652 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660654 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660659 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660661 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660664 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:15:58.666268 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660666 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660669 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660671 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660674 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660676 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660679 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660681 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660684 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660686 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660690 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660694 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660698 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660701 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660704 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660706 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660709 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660712 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660715 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660717 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:58.666769 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660720 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660722 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660724 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660727 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660729 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660732 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660734 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660737 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660739 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660742 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660746 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660748 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660751 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660753 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660756 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660758 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660761 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660763 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660766 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660768 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:15:58.667519 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660772 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:58.668089 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660774 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:58.668089 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660777 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:15:58.668089 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660779 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:15:58.668089 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660783 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:15:58.668089 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660785 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:58.668089 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660788 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:15:58.668089 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660790 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:15:58.668089 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660792 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:15:58.668089 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660795 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:15:58.668089 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.660797 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:15:58.668089 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.661301 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:15:58.669902 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.669884 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 28 19:15:58.669939 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.669903 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 28 19:15:58.669969 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669950 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:15:58.669969 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669958 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 28 19:15:58.669969 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669961 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 28 19:15:58.669969 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669965 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 28 19:15:58.669969 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669968 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 28 19:15:58.669969 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669971 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669974 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669978 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669980 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669983 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669986 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669988 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669990 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669993 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy 
Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669996 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.669998 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670001 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670003 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670006 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670008 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670010 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670013 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670015 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670017 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670020 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:15:58.670155 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670022 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670025 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670028 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670030 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670033 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670035 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670039 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670042 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670044 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670047 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670049 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670052 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670054 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670057 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670060 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670063 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670065 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670068 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670070 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:15:58.670676 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670073 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670075 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670078 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670081 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670083 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670085 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670088 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670090 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670093 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670096 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670098 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670101 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670104 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670106 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670109 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670111 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670113 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670116 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670118 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670121 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:15:58.671122 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670123 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670126 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670129 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670131 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670134 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670136 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670139 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670142 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670145 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670147 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670150 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670152 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670156 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670159 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670161 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670164 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670166 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670183 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670186 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:15:58.671597 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670189 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:15:58.672039 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670192 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:15:58.672039 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670194 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:15:58.672039 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.670199 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:15:58.672039 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670291 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:15:58.672039 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670296 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:58.672039 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670299 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:15:58.672039 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670302 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:58.672039 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670305 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:58.672039 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670307 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:15:58.672039 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670310 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:15:58.672039 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670312 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:58.672039 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670314 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:15:58.672039 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670317 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:15:58.672039 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670319 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:15:58.672039 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670322 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670324 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670326 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670329 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670331 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670334 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670336 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670339 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670341 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670344 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670346 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670350 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670353 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670355 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670357 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670359 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670362 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670364 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670367 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670369 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:15:58.672412 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670371 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670374 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670376 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670379 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670381 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670383 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670386 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670388 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670391 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670393 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670397 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670400 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670402 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670404 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670408 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670411 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670414 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670417 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670420 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:58.672882 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670423 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670426 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670428 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670431 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670433 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670436 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670438 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670441 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670444 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670446 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670448 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670451 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670453 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670455 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670458 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670460 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670462 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670465 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670467 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670470 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:58.673337 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670472 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670474 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670477 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670479 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670482 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670484 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670487 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670489 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670492 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670494 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670497 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670500 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670502 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670505 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670507 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:58.670509 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:15:58.673810 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.670514 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:15:58.674266 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.671131 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 28 19:15:58.674266 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.674108 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 28 19:15:58.675020 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.675009 2572 server.go:1019] "Starting client certificate rotation"
Apr 28 19:15:58.675119 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.675103 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:15:58.675152 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.675141 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:15:58.698823 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.698808 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:15:58.702435 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.702418 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:15:58.714724 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.714706 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 28 19:15:58.719684 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.719671 2572 log.go:25] "Validated CRI v1 image API"
Apr 28 19:15:58.720862 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.720848 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 28 19:15:58.726028 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.726010 2572 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 8177f9ac-a4e0-4880-9a1a-4a8c36da5615:/dev/nvme0n1p4 c9d82b8b-c804-4f28-9079-84ca75f07d4b:/dev/nvme0n1p3]
Apr 28 19:15:58.726088 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.726026 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 28 19:15:58.732284 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.732178 2572 manager.go:217] Machine: {Timestamp:2026-04-28 19:15:58.729766236 +0000 UTC m=+0.402211329 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3200003 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec233f3963a4574527d182b707c81110 SystemUUID:ec233f39-63a4-5745-27d1-82b707c81110 BootID:87d38337-1998-4b68-9d20-d1ca08c9f2f7 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c2:70:ac:90:79 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c2:70:ac:90:79 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:52:72:f1:88:33:b8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 28 19:15:58.732284 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.732279 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 28 19:15:58.732395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.732349 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 28 19:15:58.735098 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.735078 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 28 19:15:58.735243 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.735100 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-36.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 28 19:15:58.735286 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.735252 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 28 19:15:58.735286 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.735261 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 28 19:15:58.735286 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.735273 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 28 19:15:58.736003 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.735993 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 28 19:15:58.737390 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.737361 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 28 19:15:58.737541 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.737527 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 28 19:15:58.739485 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.739467 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 28 19:15:58.739839 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.739826 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 28 19:15:58.739877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.739848 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 28 19:15:58.739877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.739859 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 28 19:15:58.739877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.739869 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 28 19:15:58.740010 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.739880 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 28 19:15:58.740953 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.740941 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 28 19:15:58.741007 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.740959 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 28 19:15:58.743820 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.743805 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 28 19:15:58.744990 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.744977 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 28 19:15:58.746764 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.746752 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 28 19:15:58.746811 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.746770 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 28 19:15:58.746811 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.746776 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 28 19:15:58.746811 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.746781 2572 plugins.go:616] "Loaded volume plugin"
pluginName="kubernetes.io/host-path" Apr 28 19:15:58.746811 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.746789 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 28 19:15:58.746811 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.746797 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 28 19:15:58.746811 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.746806 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 28 19:15:58.746811 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.746814 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 28 19:15:58.746983 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.746820 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 28 19:15:58.746983 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.746827 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 28 19:15:58.746983 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.746835 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 28 19:15:58.746983 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.746843 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 28 19:15:58.748218 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.748207 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 28 19:15:58.748218 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.748217 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 28 19:15:58.751687 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.751670 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 28 19:15:58.751754 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.751714 2572 server.go:1295] "Started kubelet" Apr 28 19:15:58.751852 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.751827 2572 server.go:180] 
"Starting to listen" address="0.0.0.0" port=10250 Apr 28 19:15:58.751945 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.751893 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 28 19:15:58.752001 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.751980 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 28 19:15:58.752449 ip-10-0-134-36 systemd[1]: Started Kubernetes Kubelet. Apr 28 19:15:58.753070 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.753051 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 28 19:15:58.753346 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.753323 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-36.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 28 19:15:58.753725 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:58.753705 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-36.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 28 19:15:58.753775 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:58.753705 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 28 19:15:58.754303 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.754291 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 28 19:15:58.757724 ip-10-0-134-36 kubenswrapper[2572]: I0428 
19:15:58.757706 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 28 19:15:58.758565 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.758343 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 28 19:15:58.759376 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.759358 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 28 19:15:58.759376 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.759360 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 28 19:15:58.759376 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.759386 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 28 19:15:58.759545 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.759448 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 28 19:15:58.759545 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.759460 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 28 19:15:58.759650 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.759561 2572 factory.go:55] Registering systemd factory Apr 28 19:15:58.759650 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.759641 2572 factory.go:223] Registration of the systemd container factory successfully Apr 28 19:15:58.759857 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:58.759813 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-36.ec2.internal\" not found" Apr 28 19:15:58.760371 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.760248 2572 factory.go:153] Registering CRI-O factory Apr 28 19:15:58.760371 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.760268 2572 factory.go:223] Registration of the crio container factory successfully Apr 28 19:15:58.760371 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.760318 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: 
containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 28 19:15:58.760371 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.760337 2572 factory.go:103] Registering Raw factory Apr 28 19:15:58.760371 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.760350 2572 manager.go:1196] Started watching for new ooms in manager Apr 28 19:15:58.761240 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.761224 2572 manager.go:319] Starting recovery of all containers Apr 28 19:15:58.762676 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:58.762649 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-36.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 28 19:15:58.762767 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:58.762747 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 28 19:15:58.763003 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:58.762971 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 28 19:15:58.764262 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:58.763234 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-36.ec2.internal.18aa9b4d2ed69dc6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-36.ec2.internal,UID:ip-10-0-134-36.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-36.ec2.internal,},FirstTimestamp:2026-04-28 19:15:58.751686086 +0000 UTC m=+0.424131181,LastTimestamp:2026-04-28 19:15:58.751686086 +0000 UTC m=+0.424131181,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-36.ec2.internal,}" Apr 28 19:15:58.771549 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.771441 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w8vpz" Apr 28 19:15:58.772284 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.772265 2572 manager.go:324] Recovery completed Apr 28 19:15:58.776135 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.776124 2572 kubelet_node_status.go:413] "Setting node 
annotation to enable volume controller attach/detach" Apr 28 19:15:58.778427 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.778411 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:15:58.778475 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.778444 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:15:58.778475 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.778457 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:15:58.778868 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.778855 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 28 19:15:58.778868 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.778867 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 28 19:15:58.778966 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.778883 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 28 19:15:58.781014 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:58.780948 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-36.ec2.internal.18aa9b4d306eaa19 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-36.ec2.internal,UID:ip-10-0-134-36.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-36.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-36.ec2.internal,},FirstTimestamp:2026-04-28 19:15:58.778427929 +0000 UTC m=+0.450873024,LastTimestamp:2026-04-28 19:15:58.778427929 +0000 UTC 
m=+0.450873024,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-36.ec2.internal,}" Apr 28 19:15:58.782013 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.781994 2572 policy_none.go:49] "None policy: Start" Apr 28 19:15:58.782013 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.782009 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 28 19:15:58.782113 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.782019 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 28 19:15:58.782695 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.782679 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w8vpz" Apr 28 19:15:58.839705 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.824694 2572 manager.go:341] "Starting Device Plugin manager" Apr 28 19:15:58.839705 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:58.824726 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 28 19:15:58.839705 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.824738 2572 server.go:85] "Starting device plugin registration server" Apr 28 19:15:58.839705 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.824939 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 28 19:15:58.839705 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.824949 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 28 19:15:58.839705 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.825042 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 28 19:15:58.839705 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.825114 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 28 
19:15:58.839705 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.825121 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 28 19:15:58.839705 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:58.825565 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 28 19:15:58.839705 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:58.825590 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-36.ec2.internal\" not found" Apr 28 19:15:58.883205 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.883160 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 28 19:15:58.884349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.884332 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 28 19:15:58.884443 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.884361 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 28 19:15:58.884443 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.884380 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 28 19:15:58.884443 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.884387 2572 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 28 19:15:58.884443 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:58.884420 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 28 19:15:58.887080 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.887065 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:15:58.925931 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.925885 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:15:58.926809 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.926795 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:15:58.926865 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.926824 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:15:58.926865 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.926836 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:15:58.926865 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.926864 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-36.ec2.internal"
Apr 28 19:15:58.937650 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.937630 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-36.ec2.internal"
Apr 28 19:15:58.937736 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:58.937653 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-36.ec2.internal\": node \"ip-10-0-134-36.ec2.internal\" not found"
Apr 28 19:15:58.958672 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:58.958654 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-36.ec2.internal\" not found"
Apr 28 19:15:58.984704 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.984684 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-36.ec2.internal"]
Apr 28 19:15:58.984758 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.984750 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:15:58.985540 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.985524 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:15:58.985601 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.985555 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:15:58.985601 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.985565 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:15:58.987688 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.987676 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:15:58.987819 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.987806 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal"
Apr 28 19:15:58.987854 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.987831 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:15:58.988329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.988314 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:15:58.988395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.988342 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:15:58.988395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.988352 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:15:58.988395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.988315 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:15:58.988482 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.988407 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:15:58.988482 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.988417 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:15:58.990427 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.990413 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-36.ec2.internal"
Apr 28 19:15:58.990502 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.990435 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 28 19:15:58.991054 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.991041 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasSufficientMemory"
Apr 28 19:15:58.991143 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.991061 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasNoDiskPressure"
Apr 28 19:15:58.991143 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:58.991069 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeHasSufficientPID"
Apr 28 19:15:59.010625 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.010609 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-36.ec2.internal\" not found" node="ip-10-0-134-36.ec2.internal"
Apr 28 19:15:59.014619 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.014604 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-36.ec2.internal\" not found" node="ip-10-0-134-36.ec2.internal"
Apr 28 19:15:59.059397 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.059371 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-36.ec2.internal\" not found"
Apr 28 19:15:59.061532 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.061518 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ce137b5b409dd1863deb46198ae6b20-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal\" (UID: \"0ce137b5b409dd1863deb46198ae6b20\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal"
Apr 28 19:15:59.061592 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.061541 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c871a044bef489e8809d7f6e127e2847-config\") pod \"kube-apiserver-proxy-ip-10-0-134-36.ec2.internal\" (UID: \"c871a044bef489e8809d7f6e127e2847\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-36.ec2.internal"
Apr 28 19:15:59.061592 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.061557 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0ce137b5b409dd1863deb46198ae6b20-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal\" (UID: \"0ce137b5b409dd1863deb46198ae6b20\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal"
Apr 28 19:15:59.160310 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.160289 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-36.ec2.internal\" not found"
Apr 28 19:15:59.162473 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.162456 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0ce137b5b409dd1863deb46198ae6b20-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal\" (UID: \"0ce137b5b409dd1863deb46198ae6b20\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal"
Apr 28 19:15:59.162524 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.162496 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ce137b5b409dd1863deb46198ae6b20-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal\" (UID: \"0ce137b5b409dd1863deb46198ae6b20\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal"
Apr 28 19:15:59.162524 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.162515 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c871a044bef489e8809d7f6e127e2847-config\") pod \"kube-apiserver-proxy-ip-10-0-134-36.ec2.internal\" (UID: \"c871a044bef489e8809d7f6e127e2847\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-36.ec2.internal"
Apr 28 19:15:59.162617 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.162549 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0ce137b5b409dd1863deb46198ae6b20-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal\" (UID: \"0ce137b5b409dd1863deb46198ae6b20\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal"
Apr 28 19:15:59.162683 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.162617 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ce137b5b409dd1863deb46198ae6b20-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal\" (UID: \"0ce137b5b409dd1863deb46198ae6b20\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal"
Apr 28 19:15:59.162683 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.162620 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c871a044bef489e8809d7f6e127e2847-config\") pod \"kube-apiserver-proxy-ip-10-0-134-36.ec2.internal\" (UID: \"c871a044bef489e8809d7f6e127e2847\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-36.ec2.internal"
Apr 28 19:15:59.261134 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.261111 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-36.ec2.internal\" not found"
Apr 28 19:15:59.312459 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.312438 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal"
Apr 28 19:15:59.316950 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.316930 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-36.ec2.internal"
Apr 28 19:15:59.361794 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.361760 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-36.ec2.internal\" not found"
Apr 28 19:15:59.462304 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.462269 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-36.ec2.internal\" not found"
Apr 28 19:15:59.562766 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.562706 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-36.ec2.internal\" not found"
Apr 28 19:15:59.663212 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.663183 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-36.ec2.internal\" not found"
Apr 28 19:15:59.676462 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.676448 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 28 19:15:59.676580 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.676565 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 28 19:15:59.737936 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.737908 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:15:59.740029 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.740016 2572 apiserver.go:52] "Watching apiserver"
Apr 28 19:15:59.749277 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.749260 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 28 19:15:59.752418 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.752381 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-tp2kj","openshift-cluster-node-tuning-operator/tuned-8p5gv","openshift-dns/node-resolver-vzbxw","openshift-image-registry/node-ca-wtz8s","openshift-multus/multus-58mkl","openshift-multus/multus-additional-cni-plugins-hvlw8","openshift-multus/network-metrics-daemon-kbztx","openshift-network-operator/iptables-alerter-mqccp","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c","openshift-network-diagnostics/network-check-target-n6x95","openshift-ovn-kubernetes/ovnkube-node-p8r2c"]
Apr 28 19:15:59.753496 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.753477 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:15:59.757607 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.757578 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vzbxw"
Apr 28 19:15:59.757846 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.757825 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 28 19:15:59.759047 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.759031 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal"
Apr 28 19:15:59.760004 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.759981 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:15:59.760149 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.760129 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tp2kj"
Apr 28 19:15:59.761618 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.761598 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 28 19:15:59.761929 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.761913 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 28 19:15:59.761995 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.761939 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-w6sdf\""
Apr 28 19:15:59.764181 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.764149 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.764264 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.764248 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wtz8s"
Apr 28 19:15:59.764788 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.764663 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 28 19:15:59.764788 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.764733 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 28 19:15:59.764788 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.764746 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 28 19:15:59.764788 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.764772 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 28 19:15:59.765110 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765091 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 28 19:15:59.765248 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765233 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 28 19:15:59.765324 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765240 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-swf9m\""
Apr 28 19:15:59.765324 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765302 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 28 19:15:59.765410 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765348 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9bdgz\""
Apr 28 19:15:59.765448 ip-10-0-134-36
kubenswrapper[2572]: I0428 19:15:59.765411 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/704ee9d6-f870-489e-8d81-6488fb22d8be-konnectivity-ca\") pod \"konnectivity-agent-tp2kj\" (UID: \"704ee9d6-f870-489e-8d81-6488fb22d8be\") " pod="kube-system/konnectivity-agent-tp2kj" Apr 28 19:15:59.765499 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765452 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ccc8edc1-f281-4f6c-b9d7-56c52685d934-hosts-file\") pod \"node-resolver-vzbxw\" (UID: \"ccc8edc1-f281-4f6c-b9d7-56c52685d934\") " pod="openshift-dns/node-resolver-vzbxw" Apr 28 19:15:59.765499 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765485 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-system-cni-dir\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8" Apr 28 19:15:59.765578 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765512 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-cnibin\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8" Apr 28 19:15:59.765578 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765528 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/704ee9d6-f870-489e-8d81-6488fb22d8be-agent-certs\") pod \"konnectivity-agent-tp2kj\" (UID: 
\"704ee9d6-f870-489e-8d81-6488fb22d8be\") " pod="kube-system/konnectivity-agent-tp2kj" Apr 28 19:15:59.765578 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765541 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ccc8edc1-f281-4f6c-b9d7-56c52685d934-tmp-dir\") pod \"node-resolver-vzbxw\" (UID: \"ccc8edc1-f281-4f6c-b9d7-56c52685d934\") " pod="openshift-dns/node-resolver-vzbxw" Apr 28 19:15:59.765578 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765565 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-os-release\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8" Apr 28 19:15:59.765746 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765585 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8" Apr 28 19:15:59.765746 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765600 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8" Apr 28 19:15:59.765746 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765641 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-gdxqf\" (UniqueName: \"kubernetes.io/projected/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-kube-api-access-gdxqf\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8" Apr 28 19:15:59.765746 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765668 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-cni-binary-copy\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8" Apr 28 19:15:59.765746 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765685 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8" Apr 28 19:15:59.765746 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.765705 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zg26\" (UniqueName: \"kubernetes.io/projected/ccc8edc1-f281-4f6c-b9d7-56c52685d934-kube-api-access-4zg26\") pod \"node-resolver-vzbxw\" (UID: \"ccc8edc1-f281-4f6c-b9d7-56c52685d934\") " pod="openshift-dns/node-resolver-vzbxw" Apr 28 19:15:59.766495 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.766477 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.766588 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.766500 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:15:59.766651 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.766593 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-z7lvq\"" Apr 28 19:15:59.766995 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.766978 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 28 19:15:59.768704 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.768573 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx" Apr 28 19:15:59.768704 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.768637 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537" Apr 28 19:15:59.768704 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.768643 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 28 19:15:59.768904 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.768886 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 28 19:15:59.768963 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.768901 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 28 19:15:59.768963 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.768906 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-kbrhp\"" Apr 28 19:15:59.769225 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.769212 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 28 19:15:59.769283 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.769234 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-n7tb2\"" Apr 28 19:15:59.770983 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.770967 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mqccp" Apr 28 19:15:59.773157 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.773142 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" Apr 28 19:15:59.773978 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.773963 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hvftn\"" Apr 28 19:15:59.774051 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.773964 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 28 19:15:59.774051 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.774043 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-36.ec2.internal" Apr 28 19:15:59.774464 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.774449 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 28 19:15:59.774529 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.774463 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 28 19:15:59.774612 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.774593 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:15:59.775352 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.775335 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:15:59.775429 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.775398 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb" Apr 28 19:15:59.775641 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.775623 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 28 19:15:59.775858 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.775844 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 28 19:15:59.777419 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.777401 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 28 19:15:59.777620 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.777603 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bf2sf\"" Apr 28 19:15:59.777835 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.777823 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal"] Apr 28 19:15:59.777928 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.777919 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.779831 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.779812 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 28 19:15:59.780746 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.780727 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 28 19:15:59.780949 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.780928 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 28 19:15:59.781871 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.781760 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 28 19:15:59.782354 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.782301 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-97snv\"" Apr 28 19:15:59.782463 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.782448 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 28 19:15:59.783117 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.783098 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 28 19:15:59.784909 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.783965 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 28 19:15:59.784909 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.783982 2572 warnings.go:110] "Warning: metadata.name: this is used in the 
Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 28 19:15:59.784909 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.784327 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-36.ec2.internal"] Apr 28 19:15:59.784909 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.784385 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-27 19:10:58 +0000 UTC" deadline="2027-12-24 22:28:12.00564255 +0000 UTC" Apr 28 19:15:59.784909 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.784403 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14523h12m12.221242743s" Apr 28 19:15:59.810116 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.810100 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zrxpb" Apr 28 19:15:59.819435 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.819397 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zrxpb" Apr 28 19:15:59.856324 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:59.856297 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ce137b5b409dd1863deb46198ae6b20.slice/crio-5fc3eec19776bcdd41e9ba8b19facd8e48f032c11eef112bdea7adcef2e1731c WatchSource:0}: Error finding container 5fc3eec19776bcdd41e9ba8b19facd8e48f032c11eef112bdea7adcef2e1731c: Status 404 returned error can't find the container with id 5fc3eec19776bcdd41e9ba8b19facd8e48f032c11eef112bdea7adcef2e1731c Apr 28 19:15:59.856536 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:15:59.856515 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc871a044bef489e8809d7f6e127e2847.slice/crio-0d0051900f45aeaf93960bbb572345860d0115123069646fe1cf5f752a4825a0 WatchSource:0}: Error finding container 0d0051900f45aeaf93960bbb572345860d0115123069646fe1cf5f752a4825a0: Status 404 returned error can't find the container with id 0d0051900f45aeaf93960bbb572345860d0115123069646fe1cf5f752a4825a0 Apr 28 19:15:59.859865 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.859851 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:15:59.860263 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.860243 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 28 19:15:59.865868 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.865847 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-cni-binary-copy\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.865956 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.865894 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-etc-kubernetes\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.865956 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.865924 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvss4\" (UniqueName: \"kubernetes.io/projected/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-kube-api-access-hvss4\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 
19:15:59.866056 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.865958 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/704ee9d6-f870-489e-8d81-6488fb22d8be-konnectivity-ca\") pod \"konnectivity-agent-tp2kj\" (UID: \"704ee9d6-f870-489e-8d81-6488fb22d8be\") " pod="kube-system/konnectivity-agent-tp2kj" Apr 28 19:15:59.866056 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.865979 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b959573c-d823-4400-ac66-5f111c6ec711-host\") pod \"node-ca-wtz8s\" (UID: \"b959573c-d823-4400-ac66-5f111c6ec711\") " pod="openshift-image-registry/node-ca-wtz8s" Apr 28 19:15:59.866056 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.865999 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-cnibin\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.866223 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866069 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-run-netns\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.866223 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866100 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-run-systemd\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.866223 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866125 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.866223 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866152 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.866223 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866199 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-system-cni-dir\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8" Apr 28 19:15:59.866502 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866236 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/704ee9d6-f870-489e-8d81-6488fb22d8be-agent-certs\") pod \"konnectivity-agent-tp2kj\" (UID: \"704ee9d6-f870-489e-8d81-6488fb22d8be\") " pod="kube-system/konnectivity-agent-tp2kj" Apr 28 19:15:59.866502 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866273 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-systemd\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.866502 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-system-cni-dir\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8" Apr 28 19:15:59.866502 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866300 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-var-lib-cni-multus\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.866502 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866419 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" Apr 28 19:15:59.866502 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866456 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-etc-selinux\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" Apr 28 19:15:59.866502 ip-10-0-134-36 kubenswrapper[2572]: I0428 
19:15:59.866492 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-log-socket\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.866502 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866500 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/704ee9d6-f870-489e-8d81-6488fb22d8be-konnectivity-ca\") pod \"konnectivity-agent-tp2kj\" (UID: \"704ee9d6-f870-489e-8d81-6488fb22d8be\") " pod="kube-system/konnectivity-agent-tp2kj"
Apr 28 19:15:59.866833 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866517 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-ovnkube-script-lib\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.866833 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866555 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnqtn\" (UniqueName: \"kubernetes.io/projected/971a7028-d715-4000-af2d-1ba9ff023dae-kube-api-access-pnqtn\") pod \"iptables-alerter-mqccp\" (UID: \"971a7028-d715-4000-af2d-1ba9ff023dae\") " pod="openshift-network-operator/iptables-alerter-mqccp"
Apr 28 19:15:59.866833 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866582 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-run-k8s-cni-cncf-io\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.866833 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866582 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 28 19:15:59.866833 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866606 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55762\" (UniqueName: \"kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762\") pod \"network-check-target-n6x95\" (UID: \"29413dbb-70ef-4e06-8580-98f854320cbb\") " pod="openshift-network-diagnostics/network-check-target-n6x95"
Apr 28 19:15:59.866833 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866629 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-ovn-node-metrics-cert\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.866833 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866669 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-var-lib-kubelet\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.866833 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866694 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-host\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.866833 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866716 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-var-lib-kubelet\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.866833 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866739 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-run-multus-certs\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.866833 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-etc-openvswitch\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.866833 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866801 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-ovnkube-config\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.866833 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866824 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-lib-modules\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866842 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b959573c-d823-4400-ac66-5f111c6ec711-serviceca\") pod \"node-ca-wtz8s\" (UID: \"b959573c-d823-4400-ac66-5f111c6ec711\") " pod="openshift-image-registry/node-ca-wtz8s"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866865 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-run-openvswitch\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866894 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ccc8edc1-f281-4f6c-b9d7-56c52685d934-hosts-file\") pod \"node-resolver-vzbxw\" (UID: \"ccc8edc1-f281-4f6c-b9d7-56c52685d934\") " pod="openshift-dns/node-resolver-vzbxw"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866919 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-cnibin\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866962 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-sysconfig\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.866984 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5gtk\" (UniqueName: \"kubernetes.io/projected/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-kube-api-access-f5gtk\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867014 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ccc8edc1-f281-4f6c-b9d7-56c52685d934-hosts-file\") pod \"node-resolver-vzbxw\" (UID: \"ccc8edc1-f281-4f6c-b9d7-56c52685d934\") " pod="openshift-dns/node-resolver-vzbxw"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867023 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5cxj\" (UniqueName: \"kubernetes.io/projected/b959573c-d823-4400-ac66-5f111c6ec711-kube-api-access-j5cxj\") pod \"node-ca-wtz8s\" (UID: \"b959573c-d823-4400-ac66-5f111c6ec711\") " pod="openshift-image-registry/node-ca-wtz8s"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867049 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qkmr\" (UniqueName: \"kubernetes.io/projected/a341bf63-a680-4dba-8ba9-7f2a8180d537-kube-api-access-9qkmr\") pod \"network-metrics-daemon-kbztx\" (UID: \"a341bf63-a680-4dba-8ba9-7f2a8180d537\") " pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867095 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/971a7028-d715-4000-af2d-1ba9ff023dae-iptables-alerter-script\") pod \"iptables-alerter-mqccp\" (UID: \"971a7028-d715-4000-af2d-1ba9ff023dae\") " pod="openshift-network-operator/iptables-alerter-mqccp"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867134 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-os-release\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867167 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867254 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-os-release\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867298 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/971a7028-d715-4000-af2d-1ba9ff023dae-host-slash\") pod \"iptables-alerter-mqccp\" (UID: \"971a7028-d715-4000-af2d-1ba9ff023dae\") " pod="openshift-network-operator/iptables-alerter-mqccp"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867328 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-var-lib-cni-bin\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.867349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867353 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-slash\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867376 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-run-ovn\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867406 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-cni-binary-copy\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867434 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-kubernetes\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-run\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867481 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-multus-cni-dir\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867504 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-multus-socket-dir-parent\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867532 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-hostroot\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-cnibin\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867554 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-systemd-units\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867578 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-env-overrides\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zg26\" (UniqueName: \"kubernetes.io/projected/ccc8edc1-f281-4f6c-b9d7-56c52685d934-kube-api-access-4zg26\") pod \"node-resolver-vzbxw\" (UID: \"ccc8edc1-f281-4f6c-b9d7-56c52685d934\") " pod="openshift-dns/node-resolver-vzbxw"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867631 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-var-lib-openvswitch\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867657 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-cni-netd\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867683 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-run-netns\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-multus-conf-dir\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdxqf\" (UniqueName: \"kubernetes.io/projected/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-kube-api-access-gdxqf\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:15:59.868138 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867740 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867760 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-sysctl-conf\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867787 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w6zf\" (UniqueName: \"kubernetes.io/projected/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-kube-api-access-2w6zf\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867831 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867868 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-tuned\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867884 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-cni-binary-copy\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.867911 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-tmp\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868005 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-multus-daemon-config\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-device-dir\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868077 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-modprobe-d\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868100 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-os-release\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868122 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-socket-dir\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868162 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-registration-dir\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868203 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-sys-fs\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868228 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmzxp\" (UniqueName: \"kubernetes.io/projected/5c6c3d11-3969-4ce1-9787-d17dc85f7449-kube-api-access-nmzxp\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868254 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-cni-bin\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.868861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868277 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs\") pod \"network-metrics-daemon-kbztx\" (UID: \"a341bf63-a680-4dba-8ba9-7f2a8180d537\") " pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:15:59.869300 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868300 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-kubelet\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.869300 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868323 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-node-log\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.869300 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868348 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:15:59.869300 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868367 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ccc8edc1-f281-4f6c-b9d7-56c52685d934-tmp-dir\") pod \"node-resolver-vzbxw\" (UID: \"ccc8edc1-f281-4f6c-b9d7-56c52685d934\") " pod="openshift-dns/node-resolver-vzbxw"
Apr 28 19:15:59.869300 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868393 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:15:59.869300 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868417 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-system-cni-dir\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.869300 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868441 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-sysctl-d\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.869300 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868475 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-sys\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.869300 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868523 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:15:59.869300 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.868635 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ccc8edc1-f281-4f6c-b9d7-56c52685d934-tmp-dir\") pod \"node-resolver-vzbxw\" (UID: \"ccc8edc1-f281-4f6c-b9d7-56c52685d934\") " pod="openshift-dns/node-resolver-vzbxw"
Apr 28 19:15:59.870228 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.870210 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/704ee9d6-f870-489e-8d81-6488fb22d8be-agent-certs\") pod \"konnectivity-agent-tp2kj\" (UID: \"704ee9d6-f870-489e-8d81-6488fb22d8be\") " pod="kube-system/konnectivity-agent-tp2kj"
Apr 28 19:15:59.882349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.882328 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zg26\" (UniqueName: \"kubernetes.io/projected/ccc8edc1-f281-4f6c-b9d7-56c52685d934-kube-api-access-4zg26\") pod \"node-resolver-vzbxw\" (UID: \"ccc8edc1-f281-4f6c-b9d7-56c52685d934\") " pod="openshift-dns/node-resolver-vzbxw"
Apr 28 19:15:59.883588 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.883573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdxqf\" (UniqueName: \"kubernetes.io/projected/33cdcd08-3e9e-4229-b74c-2e99bdeb2074-kube-api-access-gdxqf\") pod \"multus-additional-cni-plugins-hvlw8\" (UID: \"33cdcd08-3e9e-4229-b74c-2e99bdeb2074\") " pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:15:59.887006 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.886972 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-36.ec2.internal" event={"ID":"c871a044bef489e8809d7f6e127e2847","Type":"ContainerStarted","Data":"0d0051900f45aeaf93960bbb572345860d0115123069646fe1cf5f752a4825a0"}
Apr 28 19:15:59.888051 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.888023 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal" event={"ID":"0ce137b5b409dd1863deb46198ae6b20","Type":"ContainerStarted","Data":"5fc3eec19776bcdd41e9ba8b19facd8e48f032c11eef112bdea7adcef2e1731c"}
Apr 28 19:15:59.969349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969331 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-host\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.969349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-var-lib-kubelet\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.969471 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969371 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-run-multus-certs\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.969471 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969385 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-etc-openvswitch\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.969471 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969409 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-ovnkube-config\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.969471 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-lib-modules\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.969471 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969437 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-var-lib-kubelet\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.969471 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969453 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b959573c-d823-4400-ac66-5f111c6ec711-serviceca\") pod \"node-ca-wtz8s\" (UID: \"b959573c-d823-4400-ac66-5f111c6ec711\") " pod="openshift-image-registry/node-ca-wtz8s"
Apr 28 19:15:59.969471 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969440 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-run-multus-certs\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.969471 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969468 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-host\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969477 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-run-openvswitch\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969535 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-sysconfig\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-lib-modules\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969554 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5gtk\" (UniqueName: \"kubernetes.io/projected/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-kube-api-access-f5gtk\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969581 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-etc-openvswitch\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969568 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-run-openvswitch\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969599 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5cxj\" (UniqueName: \"kubernetes.io/projected/b959573c-d823-4400-ac66-5f111c6ec711-kube-api-access-j5cxj\") pod \"node-ca-wtz8s\" (UID: \"b959573c-d823-4400-ac66-5f111c6ec711\") " pod="openshift-image-registry/node-ca-wtz8s"
Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969626 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qkmr\" (UniqueName: \"kubernetes.io/projected/a341bf63-a680-4dba-8ba9-7f2a8180d537-kube-api-access-9qkmr\") pod \"network-metrics-daemon-kbztx\" (UID: \"a341bf63-a680-4dba-8ba9-7f2a8180d537\") " pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969649 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/971a7028-d715-4000-af2d-1ba9ff023dae-iptables-alerter-script\") pod \"iptables-alerter-mqccp\" (UID: \"971a7028-d715-4000-af2d-1ba9ff023dae\") " pod="openshift-network-operator/iptables-alerter-mqccp"
Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969692 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/971a7028-d715-4000-af2d-1ba9ff023dae-host-slash\") pod \"iptables-alerter-mqccp\" (UID: \"971a7028-d715-4000-af2d-1ba9ff023dae\") " pod="openshift-network-operator/iptables-alerter-mqccp"
Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969718 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-var-lib-cni-bin\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969740 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-slash\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969754 2572
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/971a7028-d715-4000-af2d-1ba9ff023dae-host-slash\") pod \"iptables-alerter-mqccp\" (UID: \"971a7028-d715-4000-af2d-1ba9ff023dae\") " pod="openshift-network-operator/iptables-alerter-mqccp" Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969768 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-run-ovn\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-kubernetes\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969804 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-var-lib-cni-bin\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.969830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969819 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-run\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969844 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-multus-cni-dir\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969855 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-slash\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969858 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b959573c-d823-4400-ac66-5f111c6ec711-serviceca\") pod \"node-ca-wtz8s\" (UID: \"b959573c-d823-4400-ac66-5f111c6ec711\") " pod="openshift-image-registry/node-ca-wtz8s" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-multus-socket-dir-parent\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969898 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-kubernetes\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969899 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-hostroot\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969918 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-run\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969936 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-systemd-units\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969941 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-multus-socket-dir-parent\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969954 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-multus-cni-dir\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969963 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-env-overrides\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969976 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-run-ovn\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969984 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-sysconfig\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969987 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-var-lib-openvswitch\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-cni-netd\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-run-netns\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-ovnkube-config\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.970522 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.969996 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-systemd-units\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970076 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-multus-conf-dir\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970072 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-var-lib-openvswitch\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970095 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-hostroot\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-sysctl-conf\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970146 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-cni-netd\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970164 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w6zf\" (UniqueName: \"kubernetes.io/projected/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-kube-api-access-2w6zf\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970206 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-run-netns\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970207 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-tuned\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970233 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-tmp\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970250 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-sysctl-conf\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970279 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-multus-conf-dir\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970346 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-multus-daemon-config\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-device-dir\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970414 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-modprobe-d\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970444 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-os-release\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970465 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-socket-dir\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970491 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-registration-dir\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" Apr 28 19:15:59.971189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970509 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-device-dir\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970590 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-registration-dir\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970607 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-modprobe-d\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970624 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-socket-dir\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970648 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-sys-fs\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 
19:15:59.970667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/971a7028-d715-4000-af2d-1ba9ff023dae-iptables-alerter-script\") pod \"iptables-alerter-mqccp\" (UID: \"971a7028-d715-4000-af2d-1ba9ff023dae\") " pod="openshift-network-operator/iptables-alerter-mqccp" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970679 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmzxp\" (UniqueName: \"kubernetes.io/projected/5c6c3d11-3969-4ce1-9787-d17dc85f7449-kube-api-access-nmzxp\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970709 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-cni-bin\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970718 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-sys-fs\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970732 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs\") pod \"network-metrics-daemon-kbztx\" (UID: \"a341bf63-a680-4dba-8ba9-7f2a8180d537\") " 
pod="openshift-multus/network-metrics-daemon-kbztx" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970758 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-cni-bin\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970799 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-os-release\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970846 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-env-overrides\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-kubelet\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-multus-daemon-config\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 
19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970882 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-kubelet\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970885 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-node-log\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.971787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970915 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-node-log\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970925 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-system-cni-dir\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.970928 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970954 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-sysctl-d\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970972 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-sys\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970977 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-system-cni-dir\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.970998 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-sys\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.971007 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs podName:a341bf63-a680-4dba-8ba9-7f2a8180d537 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:00.470973802 +0000 UTC m=+2.143418898 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs") pod "network-metrics-daemon-kbztx" (UID: "a341bf63-a680-4dba-8ba9-7f2a8180d537") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971039 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-cni-binary-copy\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971064 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-etc-kubernetes\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971076 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-sysctl-d\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971089 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvss4\" (UniqueName: \"kubernetes.io/projected/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-kube-api-access-hvss4\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl" Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971114 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-etc-kubernetes\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971116 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b959573c-d823-4400-ac66-5f111c6ec711-host\") pod \"node-ca-wtz8s\" (UID: \"b959573c-d823-4400-ac66-5f111c6ec711\") " pod="openshift-image-registry/node-ca-wtz8s"
Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-cnibin\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971197 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-run-netns\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971210 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b959573c-d823-4400-ac66-5f111c6ec711-host\") pod \"node-ca-wtz8s\" (UID: \"b959573c-d823-4400-ac66-5f111c6ec711\") " pod="openshift-image-registry/node-ca-wtz8s"
Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971223 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-run-systemd\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.972329 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971250 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971264 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-cnibin\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971277 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971303 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-systemd\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-run-ovn-kubernetes\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971327 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-var-lib-cni-multus\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971335 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-run-netns\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971305 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-run-systemd\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-systemd\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971378 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-var-lib-cni-multus\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971385 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971396 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971431 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-etc-selinux\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971448 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-log-socket\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971467 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-ovnkube-script-lib\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971491 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnqtn\" (UniqueName: \"kubernetes.io/projected/971a7028-d715-4000-af2d-1ba9ff023dae-kube-api-access-pnqtn\") pod \"iptables-alerter-mqccp\" (UID: \"971a7028-d715-4000-af2d-1ba9ff023dae\") " pod="openshift-network-operator/iptables-alerter-mqccp"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971495 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-log-socket\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.972884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971506 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c"
Apr 28 19:15:59.973395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971514 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-run-k8s-cni-cncf-io\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.973395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971530 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-cni-binary-copy\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.973395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971537 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55762\" (UniqueName: \"kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762\") pod \"network-check-target-n6x95\" (UID: \"29413dbb-70ef-4e06-8580-98f854320cbb\") " pod="openshift-network-diagnostics/network-check-target-n6x95"
Apr 28 19:15:59.973395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971563 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-ovn-node-metrics-cert\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.973395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971588 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-var-lib-kubelet\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.973395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971590 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-host-run-k8s-cni-cncf-io\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.973395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971565 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5c6c3d11-3969-4ce1-9787-d17dc85f7449-etc-selinux\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c"
Apr 28 19:15:59.973395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971652 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-var-lib-kubelet\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.973395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.971972 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-ovnkube-script-lib\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.973395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.972373 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-tmp\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.973395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.972507 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-etc-tuned\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.973395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.973329 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-ovn-node-metrics-cert\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.977767 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.977748 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5cxj\" (UniqueName: \"kubernetes.io/projected/b959573c-d823-4400-ac66-5f111c6ec711-kube-api-access-j5cxj\") pod \"node-ca-wtz8s\" (UID: \"b959573c-d823-4400-ac66-5f111c6ec711\") " pod="openshift-image-registry/node-ca-wtz8s"
Apr 28 19:15:59.978735 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.978712 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5gtk\" (UniqueName: \"kubernetes.io/projected/e10b87c8-9253-42f3-a4fb-0a8212d8fd26-kube-api-access-f5gtk\") pod \"tuned-8p5gv\" (UID: \"e10b87c8-9253-42f3-a4fb-0a8212d8fd26\") " pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:15:59.982282 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.982258 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qkmr\" (UniqueName: \"kubernetes.io/projected/a341bf63-a680-4dba-8ba9-7f2a8180d537-kube-api-access-9qkmr\") pod \"network-metrics-daemon-kbztx\" (UID: \"a341bf63-a680-4dba-8ba9-7f2a8180d537\") " pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:15:59.982353 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.982293 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmzxp\" (UniqueName: \"kubernetes.io/projected/5c6c3d11-3969-4ce1-9787-d17dc85f7449-kube-api-access-nmzxp\") pod \"aws-ebs-csi-driver-node-4g49c\" (UID: \"5c6c3d11-3969-4ce1-9787-d17dc85f7449\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c"
Apr 28 19:15:59.982895 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.982876 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w6zf\" (UniqueName: \"kubernetes.io/projected/88d48ce5-a443-48e0-b53f-8ffb2088ab5f-kube-api-access-2w6zf\") pod \"ovnkube-node-p8r2c\" (UID: \"88d48ce5-a443-48e0-b53f-8ffb2088ab5f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:15:59.984206 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.984193 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:15:59.984249 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.984208 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:15:59.984249 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.984217 2572 projected.go:194] Error preparing data for projected volume kube-api-access-55762 for pod openshift-network-diagnostics/network-check-target-n6x95: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:15:59.984312 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:15:59.984266 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762 podName:29413dbb-70ef-4e06-8580-98f854320cbb nodeName:}" failed. No retries permitted until 2026-04-28 19:16:00.484252588 +0000 UTC m=+2.156697669 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-55762" (UniqueName: "kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762") pod "network-check-target-n6x95" (UID: "29413dbb-70ef-4e06-8580-98f854320cbb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:15:59.987932 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.987917 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvss4\" (UniqueName: \"kubernetes.io/projected/5369f9de-bd9f-4ef6-877b-ef3932a99bd9-kube-api-access-hvss4\") pod \"multus-58mkl\" (UID: \"5369f9de-bd9f-4ef6-877b-ef3932a99bd9\") " pod="openshift-multus/multus-58mkl"
Apr 28 19:15:59.990453 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:15:59.990436 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnqtn\" (UniqueName: \"kubernetes.io/projected/971a7028-d715-4000-af2d-1ba9ff023dae-kube-api-access-pnqtn\") pod \"iptables-alerter-mqccp\" (UID: \"971a7028-d715-4000-af2d-1ba9ff023dae\") " pod="openshift-network-operator/iptables-alerter-mqccp"
Apr 28 19:16:00.080921 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.080848 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vzbxw"
Apr 28 19:16:00.086467 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.086450 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hvlw8"
Apr 28 19:16:00.087055 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:16:00.087031 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccc8edc1_f281_4f6c_b9d7_56c52685d934.slice/crio-b2e7cdec2f519285afe70749b392155bdaa5fe39dab208cd48a110b89269c394 WatchSource:0}: Error finding container b2e7cdec2f519285afe70749b392155bdaa5fe39dab208cd48a110b89269c394: Status 404 returned error can't find the container with id b2e7cdec2f519285afe70749b392155bdaa5fe39dab208cd48a110b89269c394
Apr 28 19:16:00.092921 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:16:00.092900 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33cdcd08_3e9e_4229_b74c_2e99bdeb2074.slice/crio-93c7e5a44979e6533b1fb5547aaf5839beb0e17145e5c2fb68f676b8c878615a WatchSource:0}: Error finding container 93c7e5a44979e6533b1fb5547aaf5839beb0e17145e5c2fb68f676b8c878615a: Status 404 returned error can't find the container with id 93c7e5a44979e6533b1fb5547aaf5839beb0e17145e5c2fb68f676b8c878615a
Apr 28 19:16:00.104600 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.104586 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tp2kj"
Apr 28 19:16:00.110571 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:16:00.110553 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod704ee9d6_f870_489e_8d81_6488fb22d8be.slice/crio-3e3a621a9255badd5878ba0fe35a79b38eb47c02d45e1653513cbf25de3ab580 WatchSource:0}: Error finding container 3e3a621a9255badd5878ba0fe35a79b38eb47c02d45e1653513cbf25de3ab580: Status 404 returned error can't find the container with id 3e3a621a9255badd5878ba0fe35a79b38eb47c02d45e1653513cbf25de3ab580
Apr 28 19:16:00.122448 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.122431 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8p5gv"
Apr 28 19:16:00.127615 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.127601 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wtz8s"
Apr 28 19:16:00.129262 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:16:00.129240 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode10b87c8_9253_42f3_a4fb_0a8212d8fd26.slice/crio-2ff916c5ca661e80bb228e96cce9c7b5ac5962927f2dc3429b9a853461ae1ead WatchSource:0}: Error finding container 2ff916c5ca661e80bb228e96cce9c7b5ac5962927f2dc3429b9a853461ae1ead: Status 404 returned error can't find the container with id 2ff916c5ca661e80bb228e96cce9c7b5ac5962927f2dc3429b9a853461ae1ead
Apr 28 19:16:00.133350 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:16:00.133330 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb959573c_d823_4400_ac66_5f111c6ec711.slice/crio-7f14a0b9375bf07ac82643af120e215b3a05ed5bd5dd36a4422e23a6dc369378 WatchSource:0}: Error finding container 7f14a0b9375bf07ac82643af120e215b3a05ed5bd5dd36a4422e23a6dc369378: Status 404 returned error can't find the container with id 7f14a0b9375bf07ac82643af120e215b3a05ed5bd5dd36a4422e23a6dc369378
Apr 28 19:16:00.142997 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.142983 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-58mkl"
Apr 28 19:16:00.148639 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:16:00.148619 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5369f9de_bd9f_4ef6_877b_ef3932a99bd9.slice/crio-232a626465b1c33dcfaaf5d20da1b52f3276e3690f4c0d84eed004e34ca8671f WatchSource:0}: Error finding container 232a626465b1c33dcfaaf5d20da1b52f3276e3690f4c0d84eed004e34ca8671f: Status 404 returned error can't find the container with id 232a626465b1c33dcfaaf5d20da1b52f3276e3690f4c0d84eed004e34ca8671f
Apr 28 19:16:00.164062 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.164040 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mqccp"
Apr 28 19:16:00.170796 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.170777 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c"
Apr 28 19:16:00.173022 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:16:00.173001 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod971a7028_d715_4000_af2d_1ba9ff023dae.slice/crio-42b132f51fa37f7d887ded51fd80e773ade9e0cab6fbe62f55a20c14845f21d0 WatchSource:0}: Error finding container 42b132f51fa37f7d887ded51fd80e773ade9e0cab6fbe62f55a20c14845f21d0: Status 404 returned error can't find the container with id 42b132f51fa37f7d887ded51fd80e773ade9e0cab6fbe62f55a20c14845f21d0
Apr 28 19:16:00.177453 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.177434 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:16:00.177585 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:16:00.177568 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c6c3d11_3969_4ce1_9787_d17dc85f7449.slice/crio-ea2a05c5c5d1d2339b1f8ac2d4624be6115c117cbf83832aa062a3a80b068ce2 WatchSource:0}: Error finding container ea2a05c5c5d1d2339b1f8ac2d4624be6115c117cbf83832aa062a3a80b068ce2: Status 404 returned error can't find the container with id ea2a05c5c5d1d2339b1f8ac2d4624be6115c117cbf83832aa062a3a80b068ce2
Apr 28 19:16:00.183585 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:16:00.183557 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88d48ce5_a443_48e0_b53f_8ffb2088ab5f.slice/crio-f4f7bbbf99c19275388a3048e6a38e1731e484ec9a26139477497732f5c8c5aa WatchSource:0}: Error finding container f4f7bbbf99c19275388a3048e6a38e1731e484ec9a26139477497732f5c8c5aa: Status 404 returned error can't find the container with id f4f7bbbf99c19275388a3048e6a38e1731e484ec9a26139477497732f5c8c5aa
Apr 28 19:16:00.263264 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.263239 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:00.474589 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.474557 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs\") pod \"network-metrics-daemon-kbztx\" (UID: \"a341bf63-a680-4dba-8ba9-7f2a8180d537\") " pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:16:00.474810 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:00.474733 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:00.474810 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:00.474795 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs podName:a341bf63-a680-4dba-8ba9-7f2a8180d537 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:01.474777455 +0000 UTC m=+3.147222552 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs") pod "network-metrics-daemon-kbztx" (UID: "a341bf63-a680-4dba-8ba9-7f2a8180d537") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:00.576059 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.575808 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55762\" (UniqueName: \"kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762\") pod \"network-check-target-n6x95\" (UID: \"29413dbb-70ef-4e06-8580-98f854320cbb\") " pod="openshift-network-diagnostics/network-check-target-n6x95"
Apr 28 19:16:00.576059 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:00.575976 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:16:00.576059 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:00.576001 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:16:00.576059 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:00.576017 2572 projected.go:194] Error preparing data for projected volume kube-api-access-55762 for pod openshift-network-diagnostics/network-check-target-n6x95: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:00.576507 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:00.576077 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762 podName:29413dbb-70ef-4e06-8580-98f854320cbb nodeName:}" failed. No retries permitted until 2026-04-28 19:16:01.576055886 +0000 UTC m=+3.248500974 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-55762" (UniqueName: "kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762") pod "network-check-target-n6x95" (UID: "29413dbb-70ef-4e06-8580-98f854320cbb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:00.821067 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.820936 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:10:59 +0000 UTC" deadline="2027-11-28 10:50:54.661827963 +0000 UTC"
Apr 28 19:16:00.821067 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.820971 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13887h34m53.840861554s"
Apr 28 19:16:00.832755 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.832544 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:16:00.916681 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.916632 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" event={"ID":"88d48ce5-a443-48e0-b53f-8ffb2088ab5f","Type":"ContainerStarted","Data":"f4f7bbbf99c19275388a3048e6a38e1731e484ec9a26139477497732f5c8c5aa"}
Apr 28 19:16:00.924593 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.924559 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" event={"ID":"e10b87c8-9253-42f3-a4fb-0a8212d8fd26","Type":"ContainerStarted","Data":"2ff916c5ca661e80bb228e96cce9c7b5ac5962927f2dc3429b9a853461ae1ead"}
Apr 28 19:16:00.933481 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.933439 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tp2kj" event={"ID":"704ee9d6-f870-489e-8d81-6488fb22d8be","Type":"ContainerStarted","Data":"3e3a621a9255badd5878ba0fe35a79b38eb47c02d45e1653513cbf25de3ab580"}
Apr 28 19:16:00.965202 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.965085 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vzbxw" event={"ID":"ccc8edc1-f281-4f6c-b9d7-56c52685d934","Type":"ContainerStarted","Data":"b2e7cdec2f519285afe70749b392155bdaa5fe39dab208cd48a110b89269c394"}
Apr 28 19:16:00.987477 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:00.987448 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" event={"ID":"5c6c3d11-3969-4ce1-9787-d17dc85f7449","Type":"ContainerStarted","Data":"ea2a05c5c5d1d2339b1f8ac2d4624be6115c117cbf83832aa062a3a80b068ce2"}
Apr 28 19:16:01.013060 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:01.012988 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mqccp" event={"ID":"971a7028-d715-4000-af2d-1ba9ff023dae","Type":"ContainerStarted","Data":"42b132f51fa37f7d887ded51fd80e773ade9e0cab6fbe62f55a20c14845f21d0"}
Apr 28 19:16:01.032831 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:01.032804 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-58mkl" event={"ID":"5369f9de-bd9f-4ef6-877b-ef3932a99bd9","Type":"ContainerStarted","Data":"232a626465b1c33dcfaaf5d20da1b52f3276e3690f4c0d84eed004e34ca8671f"}
Apr 28 19:16:01.060145 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:01.060057 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wtz8s" event={"ID":"b959573c-d823-4400-ac66-5f111c6ec711","Type":"ContainerStarted","Data":"7f14a0b9375bf07ac82643af120e215b3a05ed5bd5dd36a4422e23a6dc369378"}
Apr 28 19:16:01.085991 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:01.085896 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hvlw8" event={"ID":"33cdcd08-3e9e-4229-b74c-2e99bdeb2074","Type":"ContainerStarted","Data":"93c7e5a44979e6533b1fb5547aaf5839beb0e17145e5c2fb68f676b8c878615a"}
Apr 28 19:16:01.482585 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:01.482544 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs\") pod \"network-metrics-daemon-kbztx\" (UID: \"a341bf63-a680-4dba-8ba9-7f2a8180d537\") " pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:16:01.496357 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:01.496316 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:01.496504 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:01.496415 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs podName:a341bf63-a680-4dba-8ba9-7f2a8180d537 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:03.496393944 +0000 UTC m=+5.168839031 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs") pod "network-metrics-daemon-kbztx" (UID: "a341bf63-a680-4dba-8ba9-7f2a8180d537") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:16:01.583731 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:01.583663 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55762\" (UniqueName: \"kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762\") pod \"network-check-target-n6x95\" (UID: \"29413dbb-70ef-4e06-8580-98f854320cbb\") " pod="openshift-network-diagnostics/network-check-target-n6x95"
Apr 28 19:16:01.583895 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:01.583833 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:16:01.583895 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:01.583854 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:16:01.583895 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:01.583867 2572 projected.go:194] Error preparing data for projected volume kube-api-access-55762 for pod openshift-network-diagnostics/network-check-target-n6x95: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:01.584049 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:01.583920 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762 podName:29413dbb-70ef-4e06-8580-98f854320cbb nodeName:}" failed. No retries permitted until 2026-04-28 19:16:03.583901765 +0000 UTC m=+5.256346871 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-55762" (UniqueName: "kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762") pod "network-check-target-n6x95" (UID: "29413dbb-70ef-4e06-8580-98f854320cbb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:16:01.821918 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:01.821832 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:10:59 +0000 UTC" deadline="2027-09-22 05:24:34.360201582 +0000 UTC"
Apr 28 19:16:01.821918 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:01.821872 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12274h8m32.538333506s"
Apr 28 19:16:01.885958 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:01.885017 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95"
Apr 28 19:16:01.885958 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:01.885136 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb"
Apr 28 19:16:01.885958 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:01.885571 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx" Apr 28 19:16:01.885958 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:01.885670 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537" Apr 28 19:16:03.500295 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:03.500260 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs\") pod \"network-metrics-daemon-kbztx\" (UID: \"a341bf63-a680-4dba-8ba9-7f2a8180d537\") " pod="openshift-multus/network-metrics-daemon-kbztx" Apr 28 19:16:03.500750 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:03.500429 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:03.500750 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:03.500487 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs podName:a341bf63-a680-4dba-8ba9-7f2a8180d537 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:07.500468619 +0000 UTC m=+9.172913706 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs") pod "network-metrics-daemon-kbztx" (UID: "a341bf63-a680-4dba-8ba9-7f2a8180d537") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:03.601086 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:03.601018 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55762\" (UniqueName: \"kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762\") pod \"network-check-target-n6x95\" (UID: \"29413dbb-70ef-4e06-8580-98f854320cbb\") " pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:16:03.601280 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:03.601190 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:03.601280 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:03.601209 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:03.601280 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:03.601220 2572 projected.go:194] Error preparing data for projected volume kube-api-access-55762 for pod openshift-network-diagnostics/network-check-target-n6x95: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:03.601280 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:03.601269 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762 podName:29413dbb-70ef-4e06-8580-98f854320cbb nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:07.601253451 +0000 UTC m=+9.273698546 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-55762" (UniqueName: "kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762") pod "network-check-target-n6x95" (UID: "29413dbb-70ef-4e06-8580-98f854320cbb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:03.885484 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:03.884765 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx" Apr 28 19:16:03.885484 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:03.884903 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537" Apr 28 19:16:03.885484 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:03.884972 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:16:03.885484 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:03.885037 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb" Apr 28 19:16:05.885376 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:05.885340 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx" Apr 28 19:16:05.885376 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:05.885366 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:16:05.885870 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:05.885494 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537" Apr 28 19:16:05.885870 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:05.885632 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb" Apr 28 19:16:07.529962 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:07.529873 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs\") pod \"network-metrics-daemon-kbztx\" (UID: \"a341bf63-a680-4dba-8ba9-7f2a8180d537\") " pod="openshift-multus/network-metrics-daemon-kbztx" Apr 28 19:16:07.530416 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:07.530003 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:07.530416 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:07.530070 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs podName:a341bf63-a680-4dba-8ba9-7f2a8180d537 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:15.530048606 +0000 UTC m=+17.202493693 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs") pod "network-metrics-daemon-kbztx" (UID: "a341bf63-a680-4dba-8ba9-7f2a8180d537") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:07.631073 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:07.631034 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55762\" (UniqueName: \"kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762\") pod \"network-check-target-n6x95\" (UID: \"29413dbb-70ef-4e06-8580-98f854320cbb\") " pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:16:07.631263 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:07.631226 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:07.631263 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:07.631249 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:07.631263 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:07.631259 2572 projected.go:194] Error preparing data for projected volume kube-api-access-55762 for pod openshift-network-diagnostics/network-check-target-n6x95: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:07.631430 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:07.631313 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762 podName:29413dbb-70ef-4e06-8580-98f854320cbb nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:15.631293636 +0000 UTC m=+17.303738719 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-55762" (UniqueName: "kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762") pod "network-check-target-n6x95" (UID: "29413dbb-70ef-4e06-8580-98f854320cbb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:07.884733 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:07.884622 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:16:07.884897 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:07.884741 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb" Apr 28 19:16:07.885125 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:07.884638 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx" Apr 28 19:16:07.885125 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:07.885086 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537" Apr 28 19:16:09.885184 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:09.885137 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:16:09.885629 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:09.885152 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx" Apr 28 19:16:09.885629 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:09.885254 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb" Apr 28 19:16:09.885629 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:09.885371 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537" Apr 28 19:16:11.884852 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:11.884809 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:16:11.885358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:11.884814 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx" Apr 28 19:16:11.885358 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:11.884924 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb" Apr 28 19:16:11.885358 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:11.885006 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537" Apr 28 19:16:13.885533 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:13.885458 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx" Apr 28 19:16:13.885916 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:13.885458 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:16:13.885916 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:13.885606 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537" Apr 28 19:16:13.885916 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:13.885666 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb" Apr 28 19:16:15.583804 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:15.583768 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs\") pod \"network-metrics-daemon-kbztx\" (UID: \"a341bf63-a680-4dba-8ba9-7f2a8180d537\") " pod="openshift-multus/network-metrics-daemon-kbztx" Apr 28 19:16:15.584261 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:15.583939 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:15.584261 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:15.584017 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs podName:a341bf63-a680-4dba-8ba9-7f2a8180d537 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.583995672 +0000 UTC m=+33.256440768 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs") pod "network-metrics-daemon-kbztx" (UID: "a341bf63-a680-4dba-8ba9-7f2a8180d537") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:15.684739 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:15.684701 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55762\" (UniqueName: \"kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762\") pod \"network-check-target-n6x95\" (UID: \"29413dbb-70ef-4e06-8580-98f854320cbb\") " pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:16:15.684932 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:15.684880 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:15.684932 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:15.684903 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:15.684932 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:15.684914 2572 projected.go:194] Error preparing data for projected volume kube-api-access-55762 for pod openshift-network-diagnostics/network-check-target-n6x95: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:15.685095 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:15.684971 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762 podName:29413dbb-70ef-4e06-8580-98f854320cbb nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:31.68495583 +0000 UTC m=+33.357400914 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-55762" (UniqueName: "kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762") pod "network-check-target-n6x95" (UID: "29413dbb-70ef-4e06-8580-98f854320cbb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:15.884807 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:15.884732 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:16:15.884975 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:15.884742 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx" Apr 28 19:16:15.884975 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:15.884855 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb" Apr 28 19:16:15.884975 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:15.884959 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537" Apr 28 19:16:17.885372 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:17.885216 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx" Apr 28 19:16:17.885809 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:17.885285 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:16:17.885809 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:17.885491 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537" Apr 28 19:16:17.885809 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:17.885554 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb" Apr 28 19:16:18.123509 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:18.123284 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-36.ec2.internal" event={"ID":"c871a044bef489e8809d7f6e127e2847","Type":"ContainerStarted","Data":"58e0ed2982ea64b5d5a7d3e2126ba8d52df6eb3ac9db9b0fc786f0734bf4cd09"} Apr 28 19:16:18.133106 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:18.133064 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-58mkl" event={"ID":"5369f9de-bd9f-4ef6-877b-ef3932a99bd9","Type":"ContainerStarted","Data":"65eec6343a516150e10f816d527fbeaf42c2d67d98d394b386529a3b80444583"} Apr 28 19:16:18.140597 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:18.140133 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-36.ec2.internal" podStartSLOduration=19.140112059 podStartE2EDuration="19.140112059s" podCreationTimestamp="2026-04-28 19:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:18.138995678 +0000 UTC m=+19.811440804" watchObservedRunningTime="2026-04-28 19:16:18.140112059 +0000 UTC m=+19.812557204" Apr 28 19:16:18.143325 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:18.143289 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" event={"ID":"88d48ce5-a443-48e0-b53f-8ffb2088ab5f","Type":"ContainerStarted","Data":"7522cae4137245c19b65f0985511a0723a5baea4ef493f4bb60eb62dd0bfb1ae"} Apr 28 19:16:18.143468 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:18.143341 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" 
event={"ID":"88d48ce5-a443-48e0-b53f-8ffb2088ab5f","Type":"ContainerStarted","Data":"7e408ca562a5a24984b10087207176830c5f2a4c0f5c91dd98a0b05d41cb0a6e"} Apr 28 19:16:18.143468 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:18.143355 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" event={"ID":"88d48ce5-a443-48e0-b53f-8ffb2088ab5f","Type":"ContainerStarted","Data":"69c02b3671c17aee3aa598e7ee5b90bdc2073d1a5903096c040d80545577a72e"} Apr 28 19:16:18.143468 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:18.143368 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" event={"ID":"88d48ce5-a443-48e0-b53f-8ffb2088ab5f","Type":"ContainerStarted","Data":"e4843d4308f216be607c972f8caa095a57256033d04888c3493f4aa446cfdfab"} Apr 28 19:16:18.149752 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:18.149716 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" event={"ID":"e10b87c8-9253-42f3-a4fb-0a8212d8fd26","Type":"ContainerStarted","Data":"49c0df84bcb56dc83fcffb8478df2b1f8e97e8d19adaa5e9b2c444d5835879fc"} Apr 28 19:16:18.180768 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:18.180723 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8p5gv" podStartSLOduration=1.857076816 podStartE2EDuration="19.180706399s" podCreationTimestamp="2026-04-28 19:15:59 +0000 UTC" firstStartedPulling="2026-04-28 19:16:00.130623587 +0000 UTC m=+1.803068669" lastFinishedPulling="2026-04-28 19:16:17.454253164 +0000 UTC m=+19.126698252" observedRunningTime="2026-04-28 19:16:18.180210899 +0000 UTC m=+19.852656005" watchObservedRunningTime="2026-04-28 19:16:18.180706399 +0000 UTC m=+19.853151503" Apr 28 19:16:18.180950 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:18.180928 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-58mkl" podStartSLOduration=1.8400913719999998 podStartE2EDuration="19.180921781s" podCreationTimestamp="2026-04-28 19:15:59 +0000 UTC" firstStartedPulling="2026-04-28 19:16:00.149883772 +0000 UTC m=+1.822328854" lastFinishedPulling="2026-04-28 19:16:17.490714169 +0000 UTC m=+19.163159263" observedRunningTime="2026-04-28 19:16:18.160447015 +0000 UTC m=+19.832892120" watchObservedRunningTime="2026-04-28 19:16:18.180921781 +0000 UTC m=+19.853366884"
Apr 28 19:16:19.153733 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.153520 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" event={"ID":"5c6c3d11-3969-4ce1-9787-d17dc85f7449","Type":"ContainerStarted","Data":"f66797af15d867d37781fcd691144f71d3f77343bfa4ad7cde34df03b51eba59"}
Apr 28 19:16:19.155248 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.155219 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mqccp" event={"ID":"971a7028-d715-4000-af2d-1ba9ff023dae","Type":"ContainerStarted","Data":"14a3d340ab777b607bab616f6752e463185cc14a0c0bc1c06d511b0faea6ff57"}
Apr 28 19:16:19.156681 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.156654 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wtz8s" event={"ID":"b959573c-d823-4400-ac66-5f111c6ec711","Type":"ContainerStarted","Data":"618f774155be0aa8bd31f09554d22f4a82d325c8cdd2fab3382d4bf863f3674f"}
Apr 28 19:16:19.158137 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.158107 2572 generic.go:358] "Generic (PLEG): container finished" podID="33cdcd08-3e9e-4229-b74c-2e99bdeb2074" containerID="268758b4ac296aad4548f8e0b2cdd7c12e3d65cb5f86e2dcdaaf4d7be17bc2f3" exitCode=0
Apr 28 19:16:19.158235 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.158203 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hvlw8" event={"ID":"33cdcd08-3e9e-4229-b74c-2e99bdeb2074","Type":"ContainerDied","Data":"268758b4ac296aad4548f8e0b2cdd7c12e3d65cb5f86e2dcdaaf4d7be17bc2f3"}
Apr 28 19:16:19.159657 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.159613 2572 generic.go:358] "Generic (PLEG): container finished" podID="0ce137b5b409dd1863deb46198ae6b20" containerID="a6a073425f5778b484b1546d44415555f2fac5b9ea9c3bea658a5b3af59cfc54" exitCode=0
Apr 28 19:16:19.159728 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.159688 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal" event={"ID":"0ce137b5b409dd1863deb46198ae6b20","Type":"ContainerDied","Data":"a6a073425f5778b484b1546d44415555f2fac5b9ea9c3bea658a5b3af59cfc54"}
Apr 28 19:16:19.164226 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.164201 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" event={"ID":"88d48ce5-a443-48e0-b53f-8ffb2088ab5f","Type":"ContainerStarted","Data":"f85366273f81848b38f19e654b47787fe8a777e395faa993a17823627275207e"}
Apr 28 19:16:19.164321 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.164231 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" event={"ID":"88d48ce5-a443-48e0-b53f-8ffb2088ab5f","Type":"ContainerStarted","Data":"6e68b09dc570020cdc967f8e894919cdc8dc603e3d4c20a61507581afb640371"}
Apr 28 19:16:19.166261 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.166238 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tp2kj" event={"ID":"704ee9d6-f870-489e-8d81-6488fb22d8be","Type":"ContainerStarted","Data":"58af360dfee480cfcf260a12af6c9ac0eb182483a4d579aa9bdd814a19be1a85"}
Apr 28 19:16:19.167776 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.167753 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vzbxw" event={"ID":"ccc8edc1-f281-4f6c-b9d7-56c52685d934","Type":"ContainerStarted","Data":"38c79e162f3c4cd8c75c4553110cccc3e62444bf12c1e7469953edd7b318d93f"}
Apr 28 19:16:19.172876 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.172832 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-mqccp" podStartSLOduration=2.892525245 podStartE2EDuration="20.172811173s" podCreationTimestamp="2026-04-28 19:15:59 +0000 UTC" firstStartedPulling="2026-04-28 19:16:00.174515921 +0000 UTC m=+1.846961007" lastFinishedPulling="2026-04-28 19:16:17.454801842 +0000 UTC m=+19.127246935" observedRunningTime="2026-04-28 19:16:19.172559649 +0000 UTC m=+20.845004754" watchObservedRunningTime="2026-04-28 19:16:19.172811173 +0000 UTC m=+20.845256285"
Apr 28 19:16:19.211120 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.210550 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vzbxw" podStartSLOduration=3.879837468 podStartE2EDuration="21.210534077s" podCreationTimestamp="2026-04-28 19:15:58 +0000 UTC" firstStartedPulling="2026-04-28 19:16:00.088431339 +0000 UTC m=+1.760876420" lastFinishedPulling="2026-04-28 19:16:17.419127943 +0000 UTC m=+19.091573029" observedRunningTime="2026-04-28 19:16:19.210084638 +0000 UTC m=+20.882529765" watchObservedRunningTime="2026-04-28 19:16:19.210534077 +0000 UTC m=+20.882979183"
Apr 28 19:16:19.257120 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.257069 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wtz8s" podStartSLOduration=3.022149989 podStartE2EDuration="20.257051691s" podCreationTimestamp="2026-04-28 19:15:59 +0000 UTC" firstStartedPulling="2026-04-28 19:16:00.134528616 +0000 UTC m=+1.806973696" lastFinishedPulling="2026-04-28 19:16:17.369430299 +0000 UTC m=+19.041875398" observedRunningTime="2026-04-28 19:16:19.25701777 +0000 UTC m=+20.929462875" watchObservedRunningTime="2026-04-28 19:16:19.257051691 +0000 UTC m=+20.929496795"
Apr 28 19:16:19.356129 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.356102 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 28 19:16:19.837696 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.837545 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-28T19:16:19.356121411Z","UUID":"1c8603a8-d000-4e66-8c18-c9d8e5c1c42c","Handler":null,"Name":"","Endpoint":""}
Apr 28 19:16:19.840528 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.840492 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 28 19:16:19.840528 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.840524 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 28 19:16:19.885639 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.885609 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:16:19.885829 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:19.885653 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95"
Apr 28 19:16:19.885829 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:19.885759 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537"
Apr 28 19:16:19.885951 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:19.885866 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb"
Apr 28 19:16:20.172145 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:20.172066 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal" event={"ID":"0ce137b5b409dd1863deb46198ae6b20","Type":"ContainerStarted","Data":"1244e4fef9ea3b325be25fb4c0a475a581a776bc1558c8f050a827ae9750f362"}
Apr 28 19:16:20.173955 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:20.173916 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" event={"ID":"5c6c3d11-3969-4ce1-9787-d17dc85f7449","Type":"ContainerStarted","Data":"fe0e8de42f8d7cd4967d158cdda1da3bebfde935dcf00c33aa974d9330647d14"}
Apr 28 19:16:20.188628 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:20.188584 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-tp2kj" podStartSLOduration=3.84701177 podStartE2EDuration="21.188569095s" podCreationTimestamp="2026-04-28 19:15:59 +0000 UTC" firstStartedPulling="2026-04-28 19:16:00.111980735 +0000 UTC m=+1.784425820" lastFinishedPulling="2026-04-28 19:16:17.453538044 +0000 UTC m=+19.125983145" observedRunningTime="2026-04-28 19:16:19.277428038 +0000 UTC m=+20.949873169" watchObservedRunningTime="2026-04-28 19:16:20.188569095 +0000 UTC m=+21.861014197"
Apr 28 19:16:20.189446 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:20.189416 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-36.ec2.internal" podStartSLOduration=21.189401909 podStartE2EDuration="21.189401909s" podCreationTimestamp="2026-04-28 19:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:20.188703613 +0000 UTC m=+21.861148713" watchObservedRunningTime="2026-04-28 19:16:20.189401909 +0000 UTC m=+21.861847013"
Apr 28 19:16:20.542891 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:20.542845 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tp2kj"
Apr 28 19:16:20.543763 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:20.543745 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tp2kj"
Apr 28 19:16:21.178468 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:21.178431 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" event={"ID":"88d48ce5-a443-48e0-b53f-8ffb2088ab5f","Type":"ContainerStarted","Data":"21cbd9be097d18a0b6c7318d48890c1af7df167f3a8ab33d1dec3071dfe5976c"}
Apr 28 19:16:21.180432 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:21.180403 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" event={"ID":"5c6c3d11-3969-4ce1-9787-d17dc85f7449","Type":"ContainerStarted","Data":"1051e7cca6e7d804dad69967e02001d93e6e5def76d4310bf889802fcb78a10f"}
Apr 28 19:16:21.180871 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:21.180809 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-tp2kj"
Apr 28 19:16:21.181141 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:21.181122 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tp2kj"
Apr 28 19:16:21.231710 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:21.231660 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4g49c" podStartSLOduration=2.228716136 podStartE2EDuration="22.231644113s" podCreationTimestamp="2026-04-28 19:15:59 +0000 UTC" firstStartedPulling="2026-04-28 19:16:00.180778099 +0000 UTC m=+1.853223180" lastFinishedPulling="2026-04-28 19:16:20.183706063 +0000 UTC m=+21.856151157" observedRunningTime="2026-04-28 19:16:21.206459812 +0000 UTC m=+22.878904915" watchObservedRunningTime="2026-04-28 19:16:21.231644113 +0000 UTC m=+22.904089217"
Apr 28 19:16:21.885471 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:21.885432 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95"
Apr 28 19:16:21.885657 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:21.885478 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:16:21.885657 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:21.885567 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb"
Apr 28 19:16:21.885785 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:21.885734 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537"
Apr 28 19:16:23.188489 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:23.188158 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" event={"ID":"88d48ce5-a443-48e0-b53f-8ffb2088ab5f","Type":"ContainerStarted","Data":"204727ea00d3e994178acbbffb9aa11b94a892dbe05fddb6ee38819e04b7328d"}
Apr 28 19:16:23.188847 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:23.188617 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:16:23.205524 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:23.205465 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:16:23.223082 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:23.223034 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c" podStartSLOduration=6.735131116 podStartE2EDuration="24.223021108s" podCreationTimestamp="2026-04-28 19:15:59 +0000 UTC" firstStartedPulling="2026-04-28 19:16:00.185237109 +0000 UTC m=+1.857682191" lastFinishedPulling="2026-04-28 19:16:17.673127087 +0000 UTC m=+19.345572183" observedRunningTime="2026-04-28 19:16:23.222726789 +0000 UTC m=+24.895171892" watchObservedRunningTime="2026-04-28 19:16:23.223021108 +0000 UTC m=+24.895466212"
Apr 28 19:16:23.884796 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:23.884759 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95"
Apr 28 19:16:23.884973 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:23.884766 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:16:23.884973 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:23.884875 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb"
Apr 28 19:16:23.884973 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:23.884945 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537"
Apr 28 19:16:24.192020 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:24.191940 2572 generic.go:358] "Generic (PLEG): container finished" podID="33cdcd08-3e9e-4229-b74c-2e99bdeb2074" containerID="f0ddc8a656a376dd6b1fdf01655c326edefe101d01d5fa17ac5598ef47313fd9" exitCode=0
Apr 28 19:16:24.192725 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:24.192026 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hvlw8" event={"ID":"33cdcd08-3e9e-4229-b74c-2e99bdeb2074","Type":"ContainerDied","Data":"f0ddc8a656a376dd6b1fdf01655c326edefe101d01d5fa17ac5598ef47313fd9"}
Apr 28 19:16:24.192725 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:24.192590 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:16:24.192725 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:24.192632 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:16:24.206373 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:24.206348 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:16:25.066192 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:25.065952 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kbztx"]
Apr 28 19:16:25.066392 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:25.066314 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:16:25.066442 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:25.066424 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537"
Apr 28 19:16:25.066787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:25.066759 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n6x95"]
Apr 28 19:16:25.066892 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:25.066880 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95"
Apr 28 19:16:25.066993 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:25.066970 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb"
Apr 28 19:16:25.195615 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:25.195580 2572 generic.go:358] "Generic (PLEG): container finished" podID="33cdcd08-3e9e-4229-b74c-2e99bdeb2074" containerID="74e2c9ce88ba99302275cc1365308c76f751cb3c8a64fe9a713d7cacb0a38b9b" exitCode=0
Apr 28 19:16:25.196076 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:25.195645 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hvlw8" event={"ID":"33cdcd08-3e9e-4229-b74c-2e99bdeb2074","Type":"ContainerDied","Data":"74e2c9ce88ba99302275cc1365308c76f751cb3c8a64fe9a713d7cacb0a38b9b"}
Apr 28 19:16:26.199189 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:26.199143 2572 generic.go:358] "Generic (PLEG): container finished" podID="33cdcd08-3e9e-4229-b74c-2e99bdeb2074" containerID="586d67c9fc074984b0d1fe7c36dffd0b85fecfc277f509ab778c51d40455cb1a" exitCode=0
Apr 28 19:16:26.199553 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:26.199266 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hvlw8" event={"ID":"33cdcd08-3e9e-4229-b74c-2e99bdeb2074","Type":"ContainerDied","Data":"586d67c9fc074984b0d1fe7c36dffd0b85fecfc277f509ab778c51d40455cb1a"}
Apr 28 19:16:26.884901 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:26.884679 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95"
Apr 28 19:16:26.884901 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:26.884702 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:16:26.884901 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:26.884802 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb"
Apr 28 19:16:26.885164 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:26.884966 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537"
Apr 28 19:16:28.886577 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:28.886545 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95"
Apr 28 19:16:28.887098 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:28.886668 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n6x95" podUID="29413dbb-70ef-4e06-8580-98f854320cbb"
Apr 28 19:16:28.887098 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:28.886710 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:16:28.887098 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:28.886855 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537"
Apr 28 19:16:30.708089 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.708058 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-36.ec2.internal" event="NodeReady"
Apr 28 19:16:30.708573 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.708209 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 28 19:16:30.762192 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.762139 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xcxrp"]
Apr 28 19:16:30.785132 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.785099 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zv5k4"]
Apr 28 19:16:30.785357 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.785325 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xcxrp"
Apr 28 19:16:30.788017 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.787990 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 28 19:16:30.788370 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.788345 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qcg5d\""
Apr 28 19:16:30.788370 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.788366 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 28 19:16:30.796759 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.796740 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xcxrp"]
Apr 28 19:16:30.796759 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.796761 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zv5k4"]
Apr 28 19:16:30.796920 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.796853 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zv5k4"
Apr 28 19:16:30.799593 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.799574 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 28 19:16:30.799908 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.799864 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 28 19:16:30.799908 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.799902 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rkrph\""
Apr 28 19:16:30.800053 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.799916 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 28 19:16:30.884992 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.884949 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:16:30.885274 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.884969 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95"
Apr 28 19:16:30.887679 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.887656 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 28 19:16:30.888136 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.888116 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 28 19:16:30.888699 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.888680 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 28 19:16:30.888980 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.888962 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kx25j\""
Apr 28 19:16:30.889064 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.888965 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-djdhf\""
Apr 28 19:16:30.891058 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.891037 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr8zw\" (UniqueName: \"kubernetes.io/projected/847fde1a-8b63-481f-998b-c119fa746ad5-kube-api-access-hr8zw\") pod \"ingress-canary-zv5k4\" (UID: \"847fde1a-8b63-481f-998b-c119fa746ad5\") " pod="openshift-ingress-canary/ingress-canary-zv5k4"
Apr 28 19:16:30.891190 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.891074 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp"
Apr 28 19:16:30.891190 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.891092 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a9198734-608e-4a54-8ac4-e6b0cefdc390-tmp-dir\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp"
Apr 28 19:16:30.891190 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.891118 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sr7p\" (UniqueName: \"kubernetes.io/projected/a9198734-608e-4a54-8ac4-e6b0cefdc390-kube-api-access-5sr7p\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp"
Apr 28 19:16:30.891190 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.891147 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert\") pod \"ingress-canary-zv5k4\" (UID: \"847fde1a-8b63-481f-998b-c119fa746ad5\") " pod="openshift-ingress-canary/ingress-canary-zv5k4"
Apr 28 19:16:30.891408 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.891272 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9198734-608e-4a54-8ac4-e6b0cefdc390-config-volume\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp"
Apr 28 19:16:30.992460 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.992374 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hr8zw\" (UniqueName: \"kubernetes.io/projected/847fde1a-8b63-481f-998b-c119fa746ad5-kube-api-access-hr8zw\") pod \"ingress-canary-zv5k4\" (UID: \"847fde1a-8b63-481f-998b-c119fa746ad5\") " pod="openshift-ingress-canary/ingress-canary-zv5k4"
Apr 28 19:16:30.992460 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.992413 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp"
Apr 28 19:16:30.992706 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.992488 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a9198734-608e-4a54-8ac4-e6b0cefdc390-tmp-dir\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp"
Apr 28 19:16:30.992706 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:30.992510 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:30.992706 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.992533 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sr7p\" (UniqueName: \"kubernetes.io/projected/a9198734-608e-4a54-8ac4-e6b0cefdc390-kube-api-access-5sr7p\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp"
Apr 28 19:16:30.992706 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.992564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert\") pod \"ingress-canary-zv5k4\" (UID: \"847fde1a-8b63-481f-998b-c119fa746ad5\") " pod="openshift-ingress-canary/ingress-canary-zv5k4"
Apr 28 19:16:30.992706 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:30.992582 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls podName:a9198734-608e-4a54-8ac4-e6b0cefdc390 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.492561249 +0000 UTC m=+33.165006353 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls") pod "dns-default-xcxrp" (UID: "a9198734-608e-4a54-8ac4-e6b0cefdc390") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:30.992706 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:30.992623 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:30.992706 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.992639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9198734-608e-4a54-8ac4-e6b0cefdc390-config-volume\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp"
Apr 28 19:16:30.992706 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:30.992675 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert podName:847fde1a-8b63-481f-998b-c119fa746ad5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.492661025 +0000 UTC m=+33.165106106 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert") pod "ingress-canary-zv5k4" (UID: "847fde1a-8b63-481f-998b-c119fa746ad5") : secret "canary-serving-cert" not found
Apr 28 19:16:30.993163 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.992848 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a9198734-608e-4a54-8ac4-e6b0cefdc390-tmp-dir\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp"
Apr 28 19:16:30.993163 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:30.993094 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a9198734-608e-4a54-8ac4-e6b0cefdc390-config-volume\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp"
Apr 28 19:16:31.005701 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:31.005560 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sr7p\" (UniqueName: \"kubernetes.io/projected/a9198734-608e-4a54-8ac4-e6b0cefdc390-kube-api-access-5sr7p\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp"
Apr 28 19:16:31.008041 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:31.008015 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr8zw\" (UniqueName: \"kubernetes.io/projected/847fde1a-8b63-481f-998b-c119fa746ad5-kube-api-access-hr8zw\") pod \"ingress-canary-zv5k4\" (UID: \"847fde1a-8b63-481f-998b-c119fa746ad5\") " pod="openshift-ingress-canary/ingress-canary-zv5k4"
Apr 28 19:16:31.497934 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:31.497892 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert\") pod \"ingress-canary-zv5k4\" (UID: \"847fde1a-8b63-481f-998b-c119fa746ad5\") " pod="openshift-ingress-canary/ingress-canary-zv5k4"
Apr 28 19:16:31.498248 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:31.497995 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp"
Apr 28 19:16:31.498248 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:31.498040 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:31.498248 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:31.498095 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:31.498248 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:31.498135 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert podName:847fde1a-8b63-481f-998b-c119fa746ad5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:32.498114019 +0000 UTC m=+34.170559107 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert") pod "ingress-canary-zv5k4" (UID: "847fde1a-8b63-481f-998b-c119fa746ad5") : secret "canary-serving-cert" not found
Apr 28 19:16:31.498248 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:31.498152 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls podName:a9198734-608e-4a54-8ac4-e6b0cefdc390 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:32.498146161 +0000 UTC m=+34.170591244 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls") pod "dns-default-xcxrp" (UID: "a9198734-608e-4a54-8ac4-e6b0cefdc390") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:31.598727 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:31.598677 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs\") pod \"network-metrics-daemon-kbztx\" (UID: \"a341bf63-a680-4dba-8ba9-7f2a8180d537\") " pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:16:31.598912 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:31.598852 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 28 19:16:31.598977 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:31.598956 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs podName:a341bf63-a680-4dba-8ba9-7f2a8180d537 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:03.598925278 +0000 UTC m=+65.271370367 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs") pod "network-metrics-daemon-kbztx" (UID: "a341bf63-a680-4dba-8ba9-7f2a8180d537") : secret "metrics-daemon-secret" not found Apr 28 19:16:31.699427 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:31.699386 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55762\" (UniqueName: \"kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762\") pod \"network-check-target-n6x95\" (UID: \"29413dbb-70ef-4e06-8580-98f854320cbb\") " pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:16:31.702125 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:31.702099 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55762\" (UniqueName: \"kubernetes.io/projected/29413dbb-70ef-4e06-8580-98f854320cbb-kube-api-access-55762\") pod \"network-check-target-n6x95\" (UID: \"29413dbb-70ef-4e06-8580-98f854320cbb\") " pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:16:31.803270 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:31.803192 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:16:32.045019 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:32.044982 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n6x95"] Apr 28 19:16:32.211395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:32.211369 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n6x95" event={"ID":"29413dbb-70ef-4e06-8580-98f854320cbb","Type":"ContainerStarted","Data":"8953c6569512400c7b1b83192aecb7b4f09ecab930209d3823da36e173e14cb5"} Apr 28 19:16:32.505597 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:32.505572 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert\") pod \"ingress-canary-zv5k4\" (UID: \"847fde1a-8b63-481f-998b-c119fa746ad5\") " pod="openshift-ingress-canary/ingress-canary-zv5k4" Apr 28 19:16:32.505680 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:32.505667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp" Apr 28 19:16:32.505750 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:32.505717 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:16:32.505801 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:32.505766 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:16:32.505801 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:32.505778 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert podName:847fde1a-8b63-481f-998b-c119fa746ad5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:34.505763119 +0000 UTC m=+36.178208204 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert") pod "ingress-canary-zv5k4" (UID: "847fde1a-8b63-481f-998b-c119fa746ad5") : secret "canary-serving-cert" not found Apr 28 19:16:32.505898 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:32.505838 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls podName:a9198734-608e-4a54-8ac4-e6b0cefdc390 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:34.505804192 +0000 UTC m=+36.178249282 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls") pod "dns-default-xcxrp" (UID: "a9198734-608e-4a54-8ac4-e6b0cefdc390") : secret "dns-default-metrics-tls" not found Apr 28 19:16:33.216427 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:33.216391 2572 generic.go:358] "Generic (PLEG): container finished" podID="33cdcd08-3e9e-4229-b74c-2e99bdeb2074" containerID="a3418e19d5e8f38254f1531afbe8520f98bc00e1b9e512846a8c665059b6f502" exitCode=0 Apr 28 19:16:33.217296 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:33.216434 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hvlw8" event={"ID":"33cdcd08-3e9e-4229-b74c-2e99bdeb2074","Type":"ContainerDied","Data":"a3418e19d5e8f38254f1531afbe8520f98bc00e1b9e512846a8c665059b6f502"} Apr 28 19:16:34.222050 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:34.221821 2572 generic.go:358] "Generic (PLEG): container finished" podID="33cdcd08-3e9e-4229-b74c-2e99bdeb2074" 
containerID="adcf7ab77904924f41d858c6561ed2d2bd3af56f245024ceace2af88f3fef499" exitCode=0 Apr 28 19:16:34.222482 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:34.221903 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hvlw8" event={"ID":"33cdcd08-3e9e-4229-b74c-2e99bdeb2074","Type":"ContainerDied","Data":"adcf7ab77904924f41d858c6561ed2d2bd3af56f245024ceace2af88f3fef499"} Apr 28 19:16:34.520002 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:34.519916 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp" Apr 28 19:16:34.520002 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:34.519967 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert\") pod \"ingress-canary-zv5k4\" (UID: \"847fde1a-8b63-481f-998b-c119fa746ad5\") " pod="openshift-ingress-canary/ingress-canary-zv5k4" Apr 28 19:16:34.520208 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:34.520061 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:16:34.520208 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:34.520132 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls podName:a9198734-608e-4a54-8ac4-e6b0cefdc390 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:38.520117495 +0000 UTC m=+40.192562579 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls") pod "dns-default-xcxrp" (UID: "a9198734-608e-4a54-8ac4-e6b0cefdc390") : secret "dns-default-metrics-tls" not found Apr 28 19:16:34.520208 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:34.520068 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:16:34.520358 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:34.520230 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert podName:847fde1a-8b63-481f-998b-c119fa746ad5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:38.520213564 +0000 UTC m=+40.192658648 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert") pod "ingress-canary-zv5k4" (UID: "847fde1a-8b63-481f-998b-c119fa746ad5") : secret "canary-serving-cert" not found Apr 28 19:16:36.227340 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:36.227299 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n6x95" event={"ID":"29413dbb-70ef-4e06-8580-98f854320cbb","Type":"ContainerStarted","Data":"113f5cc1f386e0b24c2e7a51c2a15ab8f9901f75bf6bdb8a39b906a47436c466"} Apr 28 19:16:36.227879 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:36.227394 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-n6x95" Apr 28 19:16:36.230083 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:36.230062 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hvlw8" 
event={"ID":"33cdcd08-3e9e-4229-b74c-2e99bdeb2074","Type":"ContainerStarted","Data":"97c5e71d0751b4cbe1af684a4fd9b5c122513bb3de4d2d489c274a1b567c7a79"} Apr 28 19:16:36.263598 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:36.263543 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-n6x95" podStartSLOduration=33.745952697 podStartE2EDuration="37.263528049s" podCreationTimestamp="2026-04-28 19:15:59 +0000 UTC" firstStartedPulling="2026-04-28 19:16:32.053444315 +0000 UTC m=+33.725889398" lastFinishedPulling="2026-04-28 19:16:35.571019653 +0000 UTC m=+37.243464750" observedRunningTime="2026-04-28 19:16:36.263414597 +0000 UTC m=+37.935859700" watchObservedRunningTime="2026-04-28 19:16:36.263528049 +0000 UTC m=+37.935973153" Apr 28 19:16:36.300144 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:36.300102 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hvlw8" podStartSLOduration=6.201101772 podStartE2EDuration="38.300092314s" podCreationTimestamp="2026-04-28 19:15:58 +0000 UTC" firstStartedPulling="2026-04-28 19:16:00.094271884 +0000 UTC m=+1.766716965" lastFinishedPulling="2026-04-28 19:16:32.193262425 +0000 UTC m=+33.865707507" observedRunningTime="2026-04-28 19:16:36.299557684 +0000 UTC m=+37.972002778" watchObservedRunningTime="2026-04-28 19:16:36.300092314 +0000 UTC m=+37.972537418" Apr 28 19:16:38.549429 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:38.549383 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp" Apr 28 19:16:38.549429 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:38.549430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert\") pod \"ingress-canary-zv5k4\" (UID: \"847fde1a-8b63-481f-998b-c119fa746ad5\") " pod="openshift-ingress-canary/ingress-canary-zv5k4" Apr 28 19:16:38.549918 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:38.549538 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:16:38.549918 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:38.549556 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:16:38.549918 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:38.549590 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert podName:847fde1a-8b63-481f-998b-c119fa746ad5 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:46.54957626 +0000 UTC m=+48.222021341 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert") pod "ingress-canary-zv5k4" (UID: "847fde1a-8b63-481f-998b-c119fa746ad5") : secret "canary-serving-cert" not found Apr 28 19:16:38.549918 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:38.549624 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls podName:a9198734-608e-4a54-8ac4-e6b0cefdc390 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:46.54960422 +0000 UTC m=+48.222049317 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls") pod "dns-default-xcxrp" (UID: "a9198734-608e-4a54-8ac4-e6b0cefdc390") : secret "dns-default-metrics-tls" not found Apr 28 19:16:46.603083 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:46.603045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp" Apr 28 19:16:46.603083 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:46.603088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert\") pod \"ingress-canary-zv5k4\" (UID: \"847fde1a-8b63-481f-998b-c119fa746ad5\") " pod="openshift-ingress-canary/ingress-canary-zv5k4" Apr 28 19:16:46.603695 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:46.603200 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:16:46.603695 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:46.603208 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:16:46.603695 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:46.603277 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert podName:847fde1a-8b63-481f-998b-c119fa746ad5 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:02.603263161 +0000 UTC m=+64.275708245 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert") pod "ingress-canary-zv5k4" (UID: "847fde1a-8b63-481f-998b-c119fa746ad5") : secret "canary-serving-cert" not found Apr 28 19:16:46.603695 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:16:46.603300 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls podName:a9198734-608e-4a54-8ac4-e6b0cefdc390 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:02.603282475 +0000 UTC m=+64.275727557 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls") pod "dns-default-xcxrp" (UID: "a9198734-608e-4a54-8ac4-e6b0cefdc390") : secret "dns-default-metrics-tls" not found Apr 28 19:16:53.009596 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.009563 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw"] Apr 28 19:16:53.014245 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.014224 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw" Apr 28 19:16:53.020011 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.019987 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 28 19:16:53.021489 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.021468 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-2bgn5\"" Apr 28 19:16:53.021647 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.021498 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 28 19:16:53.021647 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.021470 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 28 19:16:53.021647 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.021556 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 28 19:16:53.025051 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.025032 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm"] Apr 28 19:16:53.028284 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.028265 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw"] Apr 28 19:16:53.028376 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.028365 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.032247 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.032227 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 28 19:16:53.032443 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.032430 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 28 19:16:53.032513 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.032464 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 28 19:16:53.032673 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.032655 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 28 19:16:53.055729 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.055699 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm"] Apr 28 19:16:53.146191 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.146146 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp2p4\" (UniqueName: \"kubernetes.io/projected/e9010f7c-79ea-4274-8181-cfacfe53f0f8-kube-api-access-bp2p4\") pod \"managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw\" (UID: \"e9010f7c-79ea-4274-8181-cfacfe53f0f8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw" Apr 28 19:16:53.146374 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.146214 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e9010f7c-79ea-4274-8181-cfacfe53f0f8-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw\" (UID: \"e9010f7c-79ea-4274-8181-cfacfe53f0f8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw" Apr 28 19:16:53.146374 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.146303 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/87198a59-22cd-4fa6-b26b-882202b3d33b-ca\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.146374 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.146340 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/87198a59-22cd-4fa6-b26b-882202b3d33b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.146374 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.146359 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/87198a59-22cd-4fa6-b26b-882202b3d33b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.146562 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.146431 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: 
\"kubernetes.io/secret/87198a59-22cd-4fa6-b26b-882202b3d33b-hub\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.146562 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.146452 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/87198a59-22cd-4fa6-b26b-882202b3d33b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.146562 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.146526 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvgz4\" (UniqueName: \"kubernetes.io/projected/87198a59-22cd-4fa6-b26b-882202b3d33b-kube-api-access-qvgz4\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.247197 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.247130 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvgz4\" (UniqueName: \"kubernetes.io/projected/87198a59-22cd-4fa6-b26b-882202b3d33b-kube-api-access-qvgz4\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.247402 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.247222 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bp2p4\" (UniqueName: \"kubernetes.io/projected/e9010f7c-79ea-4274-8181-cfacfe53f0f8-kube-api-access-bp2p4\") pod 
\"managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw\" (UID: \"e9010f7c-79ea-4274-8181-cfacfe53f0f8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw" Apr 28 19:16:53.247402 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.247251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e9010f7c-79ea-4274-8181-cfacfe53f0f8-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw\" (UID: \"e9010f7c-79ea-4274-8181-cfacfe53f0f8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw" Apr 28 19:16:53.247402 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.247290 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/87198a59-22cd-4fa6-b26b-882202b3d33b-ca\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.247402 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.247328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/87198a59-22cd-4fa6-b26b-882202b3d33b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.247402 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.247353 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/87198a59-22cd-4fa6-b26b-882202b3d33b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.247649 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.247489 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/87198a59-22cd-4fa6-b26b-882202b3d33b-hub\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.247649 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.247529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/87198a59-22cd-4fa6-b26b-882202b3d33b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.247988 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.247968 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/87198a59-22cd-4fa6-b26b-882202b3d33b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.251368 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.251343 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/87198a59-22cd-4fa6-b26b-882202b3d33b-ca\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.251469 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.251403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/87198a59-22cd-4fa6-b26b-882202b3d33b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.251509 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.251478 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e9010f7c-79ea-4274-8181-cfacfe53f0f8-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw\" (UID: \"e9010f7c-79ea-4274-8181-cfacfe53f0f8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw" Apr 28 19:16:53.251543 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.251478 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/87198a59-22cd-4fa6-b26b-882202b3d33b-hub\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.251543 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.251510 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/87198a59-22cd-4fa6-b26b-882202b3d33b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" Apr 28 19:16:53.257232 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.257201 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp2p4\" (UniqueName: \"kubernetes.io/projected/e9010f7c-79ea-4274-8181-cfacfe53f0f8-kube-api-access-bp2p4\") pod 
\"managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw\" (UID: \"e9010f7c-79ea-4274-8181-cfacfe53f0f8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw"
Apr 28 19:16:53.258349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.258321 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvgz4\" (UniqueName: \"kubernetes.io/projected/87198a59-22cd-4fa6-b26b-882202b3d33b-kube-api-access-qvgz4\") pod \"cluster-proxy-proxy-agent-6d7d75c7df-bl2fm\" (UID: \"87198a59-22cd-4fa6-b26b-882202b3d33b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm"
Apr 28 19:16:53.340964 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.340867 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw"
Apr 28 19:16:53.347640 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.347611 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm"
Apr 28 19:16:53.488360 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.488280 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw"]
Apr 28 19:16:53.491874 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:16:53.491843 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9010f7c_79ea_4274_8181_cfacfe53f0f8.slice/crio-ffda814b2e8a1b7fe99b712e977166c8e5f0fd90303f6d57cd8ff3b3feffb3e8 WatchSource:0}: Error finding container ffda814b2e8a1b7fe99b712e977166c8e5f0fd90303f6d57cd8ff3b3feffb3e8: Status 404 returned error can't find the container with id ffda814b2e8a1b7fe99b712e977166c8e5f0fd90303f6d57cd8ff3b3feffb3e8
Apr 28 19:16:53.515881 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:16:53.515842 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87198a59_22cd_4fa6_b26b_882202b3d33b.slice/crio-a752dafd236f10ce7f83263856301f9e3102de0658681dc52a9a5544a7ca99ca WatchSource:0}: Error finding container a752dafd236f10ce7f83263856301f9e3102de0658681dc52a9a5544a7ca99ca: Status 404 returned error can't find the container with id a752dafd236f10ce7f83263856301f9e3102de0658681dc52a9a5544a7ca99ca
Apr 28 19:16:53.519259 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:53.517869 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm"]
Apr 28 19:16:54.265634 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:54.265590 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm"
event={"ID":"87198a59-22cd-4fa6-b26b-882202b3d33b","Type":"ContainerStarted","Data":"a752dafd236f10ce7f83263856301f9e3102de0658681dc52a9a5544a7ca99ca"}
Apr 28 19:16:54.267494 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:54.267457 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw" event={"ID":"e9010f7c-79ea-4274-8181-cfacfe53f0f8","Type":"ContainerStarted","Data":"ffda814b2e8a1b7fe99b712e977166c8e5f0fd90303f6d57cd8ff3b3feffb3e8"}
Apr 28 19:16:56.218908 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:56.218879 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8r2c"
Apr 28 19:16:57.275868 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:57.275834 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" event={"ID":"87198a59-22cd-4fa6-b26b-882202b3d33b","Type":"ContainerStarted","Data":"2e97112cef30ef85609eebc9479683f17e824de83f2c78dfe12b4f160e7c6568"}
Apr 28 19:16:57.276992 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:57.276969 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw" event={"ID":"e9010f7c-79ea-4274-8181-cfacfe53f0f8","Type":"ContainerStarted","Data":"621e2fb0f42cb87f78e51fd35f9d27c80ecc68390c79ce36a8ae48a70b333773"}
Apr 28 19:16:57.297151 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:57.297073 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bbd48cd9c-nwwzw" podStartSLOduration=1.9099230170000001 podStartE2EDuration="5.297056811s" podCreationTimestamp="2026-04-28 19:16:52 +0000 UTC" firstStartedPulling="2026-04-28 19:16:53.493630157 +0000 UTC m=+55.166075239" lastFinishedPulling="2026-04-28 19:16:56.88076395
+0000 UTC m=+58.553209033" observedRunningTime="2026-04-28 19:16:57.296291904 +0000 UTC m=+58.968737021" watchObservedRunningTime="2026-04-28 19:16:57.297056811 +0000 UTC m=+58.969501915"
Apr 28 19:16:59.282849 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:59.282809 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" event={"ID":"87198a59-22cd-4fa6-b26b-882202b3d33b","Type":"ContainerStarted","Data":"a8e917ed1da592cf07b651c7a4b374bb2d4ae7b5a34f4fd762a89b6363173234"}
Apr 28 19:16:59.282849 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:59.282853 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" event={"ID":"87198a59-22cd-4fa6-b26b-882202b3d33b","Type":"ContainerStarted","Data":"ddac6e67426e888bb20e26e8ba32d5ce5f448e51420c1bd4e52618ac8a9e9734"}
Apr 28 19:16:59.306827 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:16:59.306776 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" podStartSLOduration=1.720342949 podStartE2EDuration="7.306762204s" podCreationTimestamp="2026-04-28 19:16:52 +0000 UTC" firstStartedPulling="2026-04-28 19:16:53.520427804 +0000 UTC m=+55.192872889" lastFinishedPulling="2026-04-28 19:16:59.10684705 +0000 UTC m=+60.779292144" observedRunningTime="2026-04-28 19:16:59.306132845 +0000 UTC m=+60.978577951" watchObservedRunningTime="2026-04-28 19:16:59.306762204 +0000 UTC m=+60.979207299"
Apr 28 19:17:02.617249 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:17:02.617210 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp"
Apr 28
19:17:02.617249 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:17:02.617252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert\") pod \"ingress-canary-zv5k4\" (UID: \"847fde1a-8b63-481f-998b-c119fa746ad5\") " pod="openshift-ingress-canary/ingress-canary-zv5k4"
Apr 28 19:17:02.617794 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:17:02.617358 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:17:02.617794 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:17:02.617371 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:17:02.617794 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:17:02.617409 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert podName:847fde1a-8b63-481f-998b-c119fa746ad5 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:34.617395317 +0000 UTC m=+96.289840398 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert") pod "ingress-canary-zv5k4" (UID: "847fde1a-8b63-481f-998b-c119fa746ad5") : secret "canary-serving-cert" not found
Apr 28 19:17:02.617794 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:17:02.617438 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls podName:a9198734-608e-4a54-8ac4-e6b0cefdc390 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:34.617421901 +0000 UTC m=+96.289867002 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls") pod "dns-default-xcxrp" (UID: "a9198734-608e-4a54-8ac4-e6b0cefdc390") : secret "dns-default-metrics-tls" not found
Apr 28 19:17:03.623391 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:17:03.623355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs\") pod \"network-metrics-daemon-kbztx\" (UID: \"a341bf63-a680-4dba-8ba9-7f2a8180d537\") " pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:17:03.623847 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:17:03.623491 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 28 19:17:03.623847 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:17:03.623565 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs podName:a341bf63-a680-4dba-8ba9-7f2a8180d537 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:07.623544056 +0000 UTC m=+129.295989156 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs") pod "network-metrics-daemon-kbztx" (UID: "a341bf63-a680-4dba-8ba9-7f2a8180d537") : secret "metrics-daemon-secret" not found
Apr 28 19:17:07.234302 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:17:07.234274 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-n6x95"
Apr 28 19:17:34.644326 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:17:34.644287 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp"
Apr 28 19:17:34.644734 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:17:34.644338 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert\") pod \"ingress-canary-zv5k4\" (UID: \"847fde1a-8b63-481f-998b-c119fa746ad5\") " pod="openshift-ingress-canary/ingress-canary-zv5k4"
Apr 28 19:17:34.644734 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:17:34.644451 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:17:34.644734 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:17:34.644474 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:17:34.644734 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:17:34.644541 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert podName:847fde1a-8b63-481f-998b-c119fa746ad5 nodeName:}" failed.
No retries permitted until 2026-04-28 19:18:38.644518937 +0000 UTC m=+160.316964023 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert") pod "ingress-canary-zv5k4" (UID: "847fde1a-8b63-481f-998b-c119fa746ad5") : secret "canary-serving-cert" not found
Apr 28 19:17:34.644734 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:17:34.644558 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls podName:a9198734-608e-4a54-8ac4-e6b0cefdc390 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:38.644549947 +0000 UTC m=+160.316995033 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls") pod "dns-default-xcxrp" (UID: "a9198734-608e-4a54-8ac4-e6b0cefdc390") : secret "dns-default-metrics-tls" not found
Apr 28 19:18:07.675575 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:07.675534 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs\") pod \"network-metrics-daemon-kbztx\" (UID: \"a341bf63-a680-4dba-8ba9-7f2a8180d537\") " pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:18:07.676082 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:18:07.675661 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 28 19:18:07.676082 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:18:07.675723 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs podName:a341bf63-a680-4dba-8ba9-7f2a8180d537 nodeName:}" failed.
No retries permitted until 2026-04-28 19:20:09.675707854 +0000 UTC m=+251.348152939 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs") pod "network-metrics-daemon-kbztx" (UID: "a341bf63-a680-4dba-8ba9-7f2a8180d537") : secret "metrics-daemon-secret" not found
Apr 28 19:18:13.184357 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:13.184326 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vzbxw_ccc8edc1-f281-4f6c-b9d7-56c52685d934/dns-node-resolver/0.log"
Apr 28 19:18:14.188565 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:14.188530 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wtz8s_b959573c-d823-4400-ac66-5f111c6ec711/node-ca/0.log"
Apr 28 19:18:33.364223 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.364189 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-4m8n9"]
Apr 28 19:18:33.367285 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.367263 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.370546 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.370521 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 28 19:18:33.370662 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.370546 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-ql9lz\""
Apr 28 19:18:33.371712 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.371695 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 28 19:18:33.371712 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.371701 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 28 19:18:33.371865 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.371723 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 28 19:18:33.391538 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.389917 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4m8n9"]
Apr 28 19:18:33.414449 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.414425 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-549d5fc798-r44zx"]
Apr 28 19:18:33.417051 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.417037 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.420058 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.420035 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 28 19:18:33.420148 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.420124 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 28 19:18:33.420224 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.420207 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 28 19:18:33.420830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.420813 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lcrng\""
Apr 28 19:18:33.426645 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.426628 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 28 19:18:33.429083 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.429064 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-549d5fc798-r44zx"]
Apr 28 19:18:33.462263 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.462243 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea-data-volume\") pod \"insights-runtime-extractor-4m8n9\" (UID: \"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea\") " pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.462361 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.462279 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4m8n9\" (UID: \"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea\") " pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.462361 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.462297 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4m8n9\" (UID: \"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea\") " pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.462438 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.462379 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea-crio-socket\") pod \"insights-runtime-extractor-4m8n9\" (UID: \"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea\") " pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.462438 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.462403 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6w6r\" (UniqueName: \"kubernetes.io/projected/dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea-kube-api-access-c6w6r\") pod \"insights-runtime-extractor-4m8n9\" (UID: \"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea\") " pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.563164 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.563140 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4m8n9\" (UID:
\"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea\") " pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.563285 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.563187 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4m8n9\" (UID: \"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea\") " pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.563285 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.563217 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-installation-pull-secrets\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.563285 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.563235 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-bound-sa-token\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.563285 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.563255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-ca-trust-extracted\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.563409 ip-10-0-134-36 kubenswrapper[2572]: I0428
19:18:33.563309 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea-crio-socket\") pod \"insights-runtime-extractor-4m8n9\" (UID: \"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea\") " pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.563442 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.563432 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea-crio-socket\") pod \"insights-runtime-extractor-4m8n9\" (UID: \"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea\") " pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.563474 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.563461 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-registry-tls\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.563506 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.563485 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xdzj\" (UniqueName: \"kubernetes.io/projected/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-kube-api-access-9xdzj\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.563546 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.563506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6w6r\" (UniqueName: \"kubernetes.io/projected/dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea-kube-api-access-c6w6r\") pod \"insights-runtime-extractor-4m8n9\"
(UID: \"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea\") " pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.563546 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.563528 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-registry-certificates\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.563631 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.563554 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-trusted-ca\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.563631 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.563612 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea-data-volume\") pod \"insights-runtime-extractor-4m8n9\" (UID: \"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea\") " pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.563703 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.563636 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-image-registry-private-configuration\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.563703 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.563663 2572
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4m8n9\" (UID: \"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea\") " pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.563924 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.563909 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea-data-volume\") pod \"insights-runtime-extractor-4m8n9\" (UID: \"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea\") " pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.565606 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.565589 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4m8n9\" (UID: \"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea\") " pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.572238 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.572217 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6w6r\" (UniqueName: \"kubernetes.io/projected/dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea-kube-api-access-c6w6r\") pod \"insights-runtime-extractor-4m8n9\" (UID: \"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea\") " pod="openshift-insights/insights-runtime-extractor-4m8n9"
Apr 28 19:18:33.664777 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.664716 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-image-registry-private-configuration\") pod \"image-registry-549d5fc798-r44zx\" (UID:
\"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.664777 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.664752 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-installation-pull-secrets\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.664777 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.664768 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-bound-sa-token\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.664989 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.664810 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-ca-trust-extracted\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.664989 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.664871 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-registry-tls\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.665212 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.665164 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for
volume \"kube-api-access-9xdzj\" (UniqueName: \"kubernetes.io/projected/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-kube-api-access-9xdzj\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.665363 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.665347 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-registry-certificates\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.665547 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.665461 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-ca-trust-extracted\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.665547 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.665466 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-trusted-ca\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx"
Apr 28 19:18:33.666273 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.666245 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-registry-certificates\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") "
pod="openshift-image-registry/image-registry-549d5fc798-r44zx" Apr 28 19:18:33.666599 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.666573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-trusted-ca\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx" Apr 28 19:18:33.667480 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.667460 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-installation-pull-secrets\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx" Apr 28 19:18:33.667570 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.667491 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-registry-tls\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx" Apr 28 19:18:33.667636 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.667576 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-image-registry-private-configuration\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx" Apr 28 19:18:33.673080 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.673059 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xdzj\" (UniqueName: 
\"kubernetes.io/projected/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-kube-api-access-9xdzj\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx" Apr 28 19:18:33.674254 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.674237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b94e1fb-0c0f-4aeb-a297-100d7d91353d-bound-sa-token\") pod \"image-registry-549d5fc798-r44zx\" (UID: \"6b94e1fb-0c0f-4aeb-a297-100d7d91353d\") " pod="openshift-image-registry/image-registry-549d5fc798-r44zx" Apr 28 19:18:33.676121 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.676107 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4m8n9" Apr 28 19:18:33.725931 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.725902 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-549d5fc798-r44zx" Apr 28 19:18:33.794029 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.793996 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4m8n9"] Apr 28 19:18:33.796665 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:18:33.796639 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-xcxrp" podUID="a9198734-608e-4a54-8ac4-e6b0cefdc390" Apr 28 19:18:33.798454 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:18:33.798428 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd8f41b0_57cb_4aa6_8b24_91e2ea5466ea.slice/crio-287b7bc8e1edf111b59298ffe7b52404592807285a63b2e3b6e5495a048b4c06 WatchSource:0}: Error finding container 287b7bc8e1edf111b59298ffe7b52404592807285a63b2e3b6e5495a048b4c06: Status 404 returned error can't find the container with id 287b7bc8e1edf111b59298ffe7b52404592807285a63b2e3b6e5495a048b4c06 Apr 28 19:18:33.806281 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:18:33.806242 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zv5k4" podUID="847fde1a-8b63-481f-998b-c119fa746ad5" Apr 28 19:18:33.849554 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:33.849526 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-549d5fc798-r44zx"] Apr 28 19:18:33.852471 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:18:33.852420 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b94e1fb_0c0f_4aeb_a297_100d7d91353d.slice/crio-621c9f4ac8f4b033a4741010e5e9f1b94258becc783b41503f7ea44eb2474b61 WatchSource:0}: Error finding container 621c9f4ac8f4b033a4741010e5e9f1b94258becc783b41503f7ea44eb2474b61: Status 404 returned error can't find the container with id 621c9f4ac8f4b033a4741010e5e9f1b94258becc783b41503f7ea44eb2474b61 Apr 28 19:18:33.896768 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:18:33.896736 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-kbztx" podUID="a341bf63-a680-4dba-8ba9-7f2a8180d537" Apr 28 19:18:34.463330 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:34.463291 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-549d5fc798-r44zx" event={"ID":"6b94e1fb-0c0f-4aeb-a297-100d7d91353d","Type":"ContainerStarted","Data":"e65960fccacf5fb68ddcb9c5d78564a1b862e33c37c2e0ab5ded97cab59f4000"} Apr 28 19:18:34.463330 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:34.463330 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-549d5fc798-r44zx" event={"ID":"6b94e1fb-0c0f-4aeb-a297-100d7d91353d","Type":"ContainerStarted","Data":"621c9f4ac8f4b033a4741010e5e9f1b94258becc783b41503f7ea44eb2474b61"} Apr 28 19:18:34.463745 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:34.463366 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-549d5fc798-r44zx" Apr 28 19:18:34.464533 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:34.464516 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-xcxrp" Apr 28 19:18:34.464533 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:34.464526 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4m8n9" event={"ID":"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea","Type":"ContainerStarted","Data":"dc1cffc7e6b177638ab6c19a272effd94edb8904e18e9648baf2c069d7b48258"} Apr 28 19:18:34.464641 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:34.464548 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4m8n9" event={"ID":"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea","Type":"ContainerStarted","Data":"287b7bc8e1edf111b59298ffe7b52404592807285a63b2e3b6e5495a048b4c06"} Apr 28 19:18:34.484626 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:34.484594 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-549d5fc798-r44zx" podStartSLOduration=1.4845805300000001 podStartE2EDuration="1.48458053s" podCreationTimestamp="2026-04-28 19:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:18:34.48372251 +0000 UTC m=+156.156167641" watchObservedRunningTime="2026-04-28 19:18:34.48458053 +0000 UTC m=+156.157025634" Apr 28 19:18:35.471509 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:35.471468 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4m8n9" event={"ID":"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea","Type":"ContainerStarted","Data":"e394f78473810d84e536e001676da33b874b2fdee53a33771eeef83152782e88"} Apr 28 19:18:36.475082 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:36.475049 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4m8n9" 
event={"ID":"dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea","Type":"ContainerStarted","Data":"e59fa999d07bbebb45f4f9050661f9748627398d2ff84e932f6ec5187b90d759"} Apr 28 19:18:36.497450 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:36.497410 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-4m8n9" podStartSLOduration=1.60959791 podStartE2EDuration="3.497397836s" podCreationTimestamp="2026-04-28 19:18:33 +0000 UTC" firstStartedPulling="2026-04-28 19:18:33.854607449 +0000 UTC m=+155.527052538" lastFinishedPulling="2026-04-28 19:18:35.742407383 +0000 UTC m=+157.414852464" observedRunningTime="2026-04-28 19:18:36.495893894 +0000 UTC m=+158.168338992" watchObservedRunningTime="2026-04-28 19:18:36.497397836 +0000 UTC m=+158.169842940" Apr 28 19:18:38.703377 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:38.703341 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp" Apr 28 19:18:38.703843 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:38.703394 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert\") pod \"ingress-canary-zv5k4\" (UID: \"847fde1a-8b63-481f-998b-c119fa746ad5\") " pod="openshift-ingress-canary/ingress-canary-zv5k4" Apr 28 19:18:38.705674 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:38.705645 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9198734-608e-4a54-8ac4-e6b0cefdc390-metrics-tls\") pod \"dns-default-xcxrp\" (UID: \"a9198734-608e-4a54-8ac4-e6b0cefdc390\") " pod="openshift-dns/dns-default-xcxrp" Apr 28 19:18:38.705786 ip-10-0-134-36 kubenswrapper[2572]: 
I0428 19:18:38.705745 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/847fde1a-8b63-481f-998b-c119fa746ad5-cert\") pod \"ingress-canary-zv5k4\" (UID: \"847fde1a-8b63-481f-998b-c119fa746ad5\") " pod="openshift-ingress-canary/ingress-canary-zv5k4" Apr 28 19:18:38.967595 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:38.967523 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qcg5d\"" Apr 28 19:18:38.975842 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:38.975824 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xcxrp" Apr 28 19:18:39.086043 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:39.086018 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xcxrp"] Apr 28 19:18:39.089240 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:18:39.089205 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9198734_608e_4a54_8ac4_e6b0cefdc390.slice/crio-7cda7a3a24c6e5690168bc7f5241b7c6dde0e867a5fd245c97d8cfd236e43342 WatchSource:0}: Error finding container 7cda7a3a24c6e5690168bc7f5241b7c6dde0e867a5fd245c97d8cfd236e43342: Status 404 returned error can't find the container with id 7cda7a3a24c6e5690168bc7f5241b7c6dde0e867a5fd245c97d8cfd236e43342 Apr 28 19:18:39.485823 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:39.485789 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xcxrp" event={"ID":"a9198734-608e-4a54-8ac4-e6b0cefdc390","Type":"ContainerStarted","Data":"7cda7a3a24c6e5690168bc7f5241b7c6dde0e867a5fd245c97d8cfd236e43342"} Apr 28 19:18:40.493909 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:40.493872 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xcxrp" 
event={"ID":"a9198734-608e-4a54-8ac4-e6b0cefdc390","Type":"ContainerStarted","Data":"b5264aef5bb3e79ec69b85c876f37dfa091d5ea5260e5148439550a4b3484731"} Apr 28 19:18:40.494284 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:40.493918 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xcxrp" event={"ID":"a9198734-608e-4a54-8ac4-e6b0cefdc390","Type":"ContainerStarted","Data":"19d740ad9b58629cd426a69d32dfaf45950923e7b15e29314dd90a2ac5032a89"} Apr 28 19:18:40.494284 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:40.494014 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xcxrp" Apr 28 19:18:40.512164 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:40.512118 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xcxrp" podStartSLOduration=129.339142036 podStartE2EDuration="2m10.512106781s" podCreationTimestamp="2026-04-28 19:16:30 +0000 UTC" firstStartedPulling="2026-04-28 19:18:39.091093317 +0000 UTC m=+160.763538402" lastFinishedPulling="2026-04-28 19:18:40.264058051 +0000 UTC m=+161.936503147" observedRunningTime="2026-04-28 19:18:40.510920753 +0000 UTC m=+162.183365856" watchObservedRunningTime="2026-04-28 19:18:40.512106781 +0000 UTC m=+162.184551883" Apr 28 19:18:44.884707 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:44.884670 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zv5k4" Apr 28 19:18:44.887812 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:44.887792 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rkrph\"" Apr 28 19:18:44.895089 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:44.895074 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zv5k4" Apr 28 19:18:45.011276 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:45.011248 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zv5k4"] Apr 28 19:18:45.014547 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:18:45.014524 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod847fde1a_8b63_481f_998b_c119fa746ad5.slice/crio-16bb574c4ab1773c50aa2ae24bb5188daa2b8bf535452a24070698dab8000599 WatchSource:0}: Error finding container 16bb574c4ab1773c50aa2ae24bb5188daa2b8bf535452a24070698dab8000599: Status 404 returned error can't find the container with id 16bb574c4ab1773c50aa2ae24bb5188daa2b8bf535452a24070698dab8000599 Apr 28 19:18:45.508236 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:45.508195 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zv5k4" event={"ID":"847fde1a-8b63-481f-998b-c119fa746ad5","Type":"ContainerStarted","Data":"16bb574c4ab1773c50aa2ae24bb5188daa2b8bf535452a24070698dab8000599"} Apr 28 19:18:46.884989 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:46.884955 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx" Apr 28 19:18:47.514041 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:47.514005 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zv5k4" event={"ID":"847fde1a-8b63-481f-998b-c119fa746ad5","Type":"ContainerStarted","Data":"a18f6ef0ee7645593a3d2d252e009dcb5a29463393e27795ac65a8cb2ddec940"} Apr 28 19:18:47.532296 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:47.532253 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zv5k4" podStartSLOduration=135.828649694 podStartE2EDuration="2m17.532241427s" podCreationTimestamp="2026-04-28 19:16:30 +0000 UTC" firstStartedPulling="2026-04-28 19:18:45.016365889 +0000 UTC m=+166.688810971" lastFinishedPulling="2026-04-28 19:18:46.719957618 +0000 UTC m=+168.392402704" observedRunningTime="2026-04-28 19:18:47.530953024 +0000 UTC m=+169.203398127" watchObservedRunningTime="2026-04-28 19:18:47.532241427 +0000 UTC m=+169.204686531" Apr 28 19:18:48.419794 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.419759 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4mgnd"] Apr 28 19:18:48.423353 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.423330 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.427899 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.427879 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 28 19:18:48.428188 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.428155 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 28 19:18:48.428239 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.428202 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 28 19:18:48.428289 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.428161 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 28 19:18:48.429023 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.428999 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 28 19:18:48.429023 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.429014 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 28 19:18:48.429219 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.429128 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xgvrs\"" Apr 28 19:18:48.574877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.574838 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx9nv\" (UniqueName: \"kubernetes.io/projected/c766c71a-a924-40ab-bc77-ac0e493e0671-kube-api-access-tx9nv\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " 
pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.574877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.574877 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c766c71a-a924-40ab-bc77-ac0e493e0671-node-exporter-accelerators-collector-config\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.575075 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.574972 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c766c71a-a924-40ab-bc77-ac0e493e0671-metrics-client-ca\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.575075 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.575026 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c766c71a-a924-40ab-bc77-ac0e493e0671-sys\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.575075 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.575048 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c766c71a-a924-40ab-bc77-ac0e493e0671-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.575075 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.575065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c766c71a-a924-40ab-bc77-ac0e493e0671-node-exporter-wtmp\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.575244 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.575088 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c766c71a-a924-40ab-bc77-ac0e493e0671-root\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.575244 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.575130 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c766c71a-a924-40ab-bc77-ac0e493e0671-node-exporter-textfile\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.575244 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.575199 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c766c71a-a924-40ab-bc77-ac0e493e0671-node-exporter-tls\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.675869 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.675778 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c766c71a-a924-40ab-bc77-ac0e493e0671-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.675869 ip-10-0-134-36 
kubenswrapper[2572]: I0428 19:18:48.675821 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c766c71a-a924-40ab-bc77-ac0e493e0671-node-exporter-wtmp\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.676064 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.675949 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c766c71a-a924-40ab-bc77-ac0e493e0671-node-exporter-wtmp\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.676064 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.675958 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c766c71a-a924-40ab-bc77-ac0e493e0671-root\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.676064 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.676004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c766c71a-a924-40ab-bc77-ac0e493e0671-node-exporter-textfile\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.676064 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.676049 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c766c71a-a924-40ab-bc77-ac0e493e0671-node-exporter-tls\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.676288 
ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.676072 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx9nv\" (UniqueName: \"kubernetes.io/projected/c766c71a-a924-40ab-bc77-ac0e493e0671-kube-api-access-tx9nv\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.676288 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.676101 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c766c71a-a924-40ab-bc77-ac0e493e0671-node-exporter-accelerators-collector-config\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.676288 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.676118 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c766c71a-a924-40ab-bc77-ac0e493e0671-root\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.676288 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.676198 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c766c71a-a924-40ab-bc77-ac0e493e0671-metrics-client-ca\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.676288 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.676234 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c766c71a-a924-40ab-bc77-ac0e493e0671-sys\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " 
pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.676497 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.676298 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c766c71a-a924-40ab-bc77-ac0e493e0671-sys\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.676497 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.676410 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c766c71a-a924-40ab-bc77-ac0e493e0671-node-exporter-textfile\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.676661 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.676645 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c766c71a-a924-40ab-bc77-ac0e493e0671-node-exporter-accelerators-collector-config\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.676705 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.676678 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c766c71a-a924-40ab-bc77-ac0e493e0671-metrics-client-ca\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.678240 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.678211 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c766c71a-a924-40ab-bc77-ac0e493e0671-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.678620 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.678597 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c766c71a-a924-40ab-bc77-ac0e493e0671-node-exporter-tls\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.684606 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.684583 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx9nv\" (UniqueName: \"kubernetes.io/projected/c766c71a-a924-40ab-bc77-ac0e493e0671-kube-api-access-tx9nv\") pod \"node-exporter-4mgnd\" (UID: \"c766c71a-a924-40ab-bc77-ac0e493e0671\") " pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.733797 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:48.733762 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4mgnd" Apr 28 19:18:48.742275 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:18:48.742247 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc766c71a_a924_40ab_bc77_ac0e493e0671.slice/crio-267cd978f7fcfd01a5ff0a726036a23ef39e035f0d83b17e16b8efd4d2677332 WatchSource:0}: Error finding container 267cd978f7fcfd01a5ff0a726036a23ef39e035f0d83b17e16b8efd4d2677332: Status 404 returned error can't find the container with id 267cd978f7fcfd01a5ff0a726036a23ef39e035f0d83b17e16b8efd4d2677332 Apr 28 19:18:49.521054 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:49.521017 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4mgnd" event={"ID":"c766c71a-a924-40ab-bc77-ac0e493e0671","Type":"ContainerStarted","Data":"267cd978f7fcfd01a5ff0a726036a23ef39e035f0d83b17e16b8efd4d2677332"} Apr 28 19:18:50.499861 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:50.499831 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xcxrp" Apr 28 19:18:50.525396 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:50.525365 2572 generic.go:358] "Generic (PLEG): container finished" podID="c766c71a-a924-40ab-bc77-ac0e493e0671" containerID="3f6832171186ce9e6e462524ba1e64e67e5c28819563bdd3f7981ec8288abf94" exitCode=0 Apr 28 19:18:50.525757 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:50.525416 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4mgnd" event={"ID":"c766c71a-a924-40ab-bc77-ac0e493e0671","Type":"ContainerDied","Data":"3f6832171186ce9e6e462524ba1e64e67e5c28819563bdd3f7981ec8288abf94"} Apr 28 19:18:51.529450 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:51.529415 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4mgnd" 
event={"ID":"c766c71a-a924-40ab-bc77-ac0e493e0671","Type":"ContainerStarted","Data":"44499b3a9a1b31a97cc4f998fcc61fbc27e522c8a1e0cdc97fe371b047e22ffb"} Apr 28 19:18:51.529450 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:51.529452 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4mgnd" event={"ID":"c766c71a-a924-40ab-bc77-ac0e493e0671","Type":"ContainerStarted","Data":"c5e9355787f943a7bfd4acf2e714ff66886ce11ade4635bf930b782a4fbb3219"} Apr 28 19:18:51.576384 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:51.576309 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-4mgnd" podStartSLOduration=2.804703554 podStartE2EDuration="3.576290996s" podCreationTimestamp="2026-04-28 19:18:48 +0000 UTC" firstStartedPulling="2026-04-28 19:18:48.744556703 +0000 UTC m=+170.417001785" lastFinishedPulling="2026-04-28 19:18:49.516144129 +0000 UTC m=+171.188589227" observedRunningTime="2026-04-28 19:18:51.575081808 +0000 UTC m=+173.247526913" watchObservedRunningTime="2026-04-28 19:18:51.576290996 +0000 UTC m=+173.248736101" Apr 28 19:18:52.874358 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:52.874325 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-d558fffc9-9lkpp"] Apr 28 19:18:52.877332 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:52.877312 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:52.880938 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:52.880910 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 28 19:18:52.881043 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:52.880950 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 28 19:18:52.881043 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:52.880961 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-6n25m\"" Apr 28 19:18:52.882371 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:52.882352 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-4q97gkit286fs\"" Apr 28 19:18:52.882445 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:52.882376 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 28 19:18:52.882541 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:52.882526 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 28 19:18:52.899062 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:52.899044 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-d558fffc9-9lkpp"] Apr 28 19:18:53.008672 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.008629 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9b8f93a4-9552-476f-b918-baf1e61ac066-audit-log\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " 
pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.008863 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.008678 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/9b8f93a4-9552-476f-b918-baf1e61ac066-secret-metrics-server-client-certs\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.008863 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.008713 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9b8f93a4-9552-476f-b918-baf1e61ac066-metrics-server-audit-profiles\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.008863 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.008734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b8f93a4-9552-476f-b918-baf1e61ac066-client-ca-bundle\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.008863 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.008779 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9b8f93a4-9552-476f-b918-baf1e61ac066-secret-metrics-server-tls\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.008863 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.008807 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8f93a4-9552-476f-b918-baf1e61ac066-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.009049 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.008881 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkmss\" (UniqueName: \"kubernetes.io/projected/9b8f93a4-9552-476f-b918-baf1e61ac066-kube-api-access-gkmss\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.109923 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.109881 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9b8f93a4-9552-476f-b918-baf1e61ac066-metrics-server-audit-profiles\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.109923 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.109928 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b8f93a4-9552-476f-b918-baf1e61ac066-client-ca-bundle\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.110192 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.109973 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/9b8f93a4-9552-476f-b918-baf1e61ac066-secret-metrics-server-tls\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.110192 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.109998 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8f93a4-9552-476f-b918-baf1e61ac066-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.110192 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.110050 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkmss\" (UniqueName: \"kubernetes.io/projected/9b8f93a4-9552-476f-b918-baf1e61ac066-kube-api-access-gkmss\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.110192 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.110084 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9b8f93a4-9552-476f-b918-baf1e61ac066-audit-log\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.110192 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.110110 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/9b8f93a4-9552-476f-b918-baf1e61ac066-secret-metrics-server-client-certs\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " 
pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.110532 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.110497 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9b8f93a4-9552-476f-b918-baf1e61ac066-audit-log\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.110791 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.110767 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8f93a4-9552-476f-b918-baf1e61ac066-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.110903 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.110881 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9b8f93a4-9552-476f-b918-baf1e61ac066-metrics-server-audit-profiles\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.112520 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.112490 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9b8f93a4-9552-476f-b918-baf1e61ac066-secret-metrics-server-tls\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.112606 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.112558 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9b8f93a4-9552-476f-b918-baf1e61ac066-client-ca-bundle\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.112606 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.112584 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/9b8f93a4-9552-476f-b918-baf1e61ac066-secret-metrics-server-client-certs\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.120611 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.120587 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkmss\" (UniqueName: \"kubernetes.io/projected/9b8f93a4-9552-476f-b918-baf1e61ac066-kube-api-access-gkmss\") pod \"metrics-server-d558fffc9-9lkpp\" (UID: \"9b8f93a4-9552-476f-b918-baf1e61ac066\") " pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.186532 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.186440 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" Apr 28 19:18:53.324021 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.323990 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-d558fffc9-9lkpp"] Apr 28 19:18:53.329371 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:18:53.329319 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8f93a4_9552_476f_b918_baf1e61ac066.slice/crio-4377de1bdfbe3e53385cb08c389c73b67ba6caec88f1c67e24d25b0cea3376e9 WatchSource:0}: Error finding container 4377de1bdfbe3e53385cb08c389c73b67ba6caec88f1c67e24d25b0cea3376e9: Status 404 returned error can't find the container with id 4377de1bdfbe3e53385cb08c389c73b67ba6caec88f1c67e24d25b0cea3376e9 Apr 28 19:18:53.535698 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.535662 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" event={"ID":"9b8f93a4-9552-476f-b918-baf1e61ac066","Type":"ContainerStarted","Data":"4377de1bdfbe3e53385cb08c389c73b67ba6caec88f1c67e24d25b0cea3376e9"} Apr 28 19:18:53.729142 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.729109 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-bcbf69d6c-7952f"] Apr 28 19:18:53.733458 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.733437 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:53.752396 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.752362 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 28 19:18:53.752396 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.752387 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 28 19:18:53.752580 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.752482 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 28 19:18:53.752580 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.752482 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-xb8bv\"" Apr 28 19:18:53.767409 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.767385 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 28 19:18:53.768255 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.768241 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 28 19:18:53.774710 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.774692 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 28 19:18:53.819593 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.819493 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-bcbf69d6c-7952f"] Apr 28 19:18:53.916109 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.916072 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dffa9e69-20fa-40fa-897f-dd771bc75935-serving-certs-ca-bundle\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:53.916504 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.916114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dffa9e69-20fa-40fa-897f-dd771bc75935-telemeter-trusted-ca-bundle\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:53.916504 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.916207 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd72z\" (UniqueName: \"kubernetes.io/projected/dffa9e69-20fa-40fa-897f-dd771bc75935-kube-api-access-kd72z\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:53.916504 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.916239 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dffa9e69-20fa-40fa-897f-dd771bc75935-metrics-client-ca\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:53.916504 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.916275 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/dffa9e69-20fa-40fa-897f-dd771bc75935-secret-telemeter-client\") pod 
\"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:53.916504 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.916299 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dffa9e69-20fa-40fa-897f-dd771bc75935-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:53.916504 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.916326 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/dffa9e69-20fa-40fa-897f-dd771bc75935-federate-client-tls\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:53.916504 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:53.916352 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/dffa9e69-20fa-40fa-897f-dd771bc75935-telemeter-client-tls\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.017085 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.017046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kd72z\" (UniqueName: \"kubernetes.io/projected/dffa9e69-20fa-40fa-897f-dd771bc75935-kube-api-access-kd72z\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" 
Apr 28 19:18:54.017279 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.017098 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dffa9e69-20fa-40fa-897f-dd771bc75935-metrics-client-ca\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.017343 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.017296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/dffa9e69-20fa-40fa-897f-dd771bc75935-secret-telemeter-client\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.017391 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.017349 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dffa9e69-20fa-40fa-897f-dd771bc75935-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.017444 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.017395 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/dffa9e69-20fa-40fa-897f-dd771bc75935-federate-client-tls\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.017488 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.017469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: 
\"kubernetes.io/secret/dffa9e69-20fa-40fa-897f-dd771bc75935-telemeter-client-tls\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.017541 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.017526 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dffa9e69-20fa-40fa-897f-dd771bc75935-serving-certs-ca-bundle\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.017589 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.017569 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dffa9e69-20fa-40fa-897f-dd771bc75935-telemeter-trusted-ca-bundle\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.017908 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.017852 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dffa9e69-20fa-40fa-897f-dd771bc75935-metrics-client-ca\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.018439 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.018394 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dffa9e69-20fa-40fa-897f-dd771bc75935-telemeter-trusted-ca-bundle\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" 
Apr 28 19:18:54.018556 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.018535 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dffa9e69-20fa-40fa-897f-dd771bc75935-serving-certs-ca-bundle\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.020271 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.020246 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/dffa9e69-20fa-40fa-897f-dd771bc75935-federate-client-tls\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.020423 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.020405 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dffa9e69-20fa-40fa-897f-dd771bc75935-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.020552 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.020533 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/dffa9e69-20fa-40fa-897f-dd771bc75935-telemeter-client-tls\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.020591 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.020535 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/dffa9e69-20fa-40fa-897f-dd771bc75935-secret-telemeter-client\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.033259 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.033239 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd72z\" (UniqueName: \"kubernetes.io/projected/dffa9e69-20fa-40fa-897f-dd771bc75935-kube-api-access-kd72z\") pod \"telemeter-client-bcbf69d6c-7952f\" (UID: \"dffa9e69-20fa-40fa-897f-dd771bc75935\") " pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.042012 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.041994 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" Apr 28 19:18:54.190366 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.190335 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-bcbf69d6c-7952f"] Apr 28 19:18:54.194075 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:18:54.194050 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddffa9e69_20fa_40fa_897f_dd771bc75935.slice/crio-f431c54c255ebef63b737c67a411fe09a956bb14a66b431dd31e93b7d27a534a WatchSource:0}: Error finding container f431c54c255ebef63b737c67a411fe09a956bb14a66b431dd31e93b7d27a534a: Status 404 returned error can't find the container with id f431c54c255ebef63b737c67a411fe09a956bb14a66b431dd31e93b7d27a534a Apr 28 19:18:54.539045 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:54.539003 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" event={"ID":"dffa9e69-20fa-40fa-897f-dd771bc75935","Type":"ContainerStarted","Data":"f431c54c255ebef63b737c67a411fe09a956bb14a66b431dd31e93b7d27a534a"} Apr 28 
19:18:55.476312 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:55.476275 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-549d5fc798-r44zx" Apr 28 19:18:55.543878 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:55.543836 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" event={"ID":"9b8f93a4-9552-476f-b918-baf1e61ac066","Type":"ContainerStarted","Data":"6bb169e9df9a39366a1353c2858a688155c6da9404b2ffc4197a4f580e27b2dc"} Apr 28 19:18:55.573135 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:55.573036 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp" podStartSLOduration=2.093243663 podStartE2EDuration="3.573016228s" podCreationTimestamp="2026-04-28 19:18:52 +0000 UTC" firstStartedPulling="2026-04-28 19:18:53.330666895 +0000 UTC m=+175.003111977" lastFinishedPulling="2026-04-28 19:18:54.810439459 +0000 UTC m=+176.482884542" observedRunningTime="2026-04-28 19:18:55.572628545 +0000 UTC m=+177.245073652" watchObservedRunningTime="2026-04-28 19:18:55.573016228 +0000 UTC m=+177.245461333" Apr 28 19:18:56.547647 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:56.547614 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" event={"ID":"dffa9e69-20fa-40fa-897f-dd771bc75935","Type":"ContainerStarted","Data":"aa36e2f090c587ea8f558a3377fe77c89748d1e5395c7c0ce10f360db6724dfa"} Apr 28 19:18:57.551688 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:57.551650 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" event={"ID":"dffa9e69-20fa-40fa-897f-dd771bc75935","Type":"ContainerStarted","Data":"d046e02fd2c9e66642057f9aebf8c59c5abf03ae41bf72c8decd40a81b5a60e7"} Apr 28 19:18:57.551688 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:57.551693 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" event={"ID":"dffa9e69-20fa-40fa-897f-dd771bc75935","Type":"ContainerStarted","Data":"f225a40c38718b6f72cd3329bc21df0acf9f89e1794822907cbdf412ab8668ba"}
Apr 28 19:18:57.591071 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:18:57.591020 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-bcbf69d6c-7952f" podStartSLOduration=1.709808013 podStartE2EDuration="4.591005889s" podCreationTimestamp="2026-04-28 19:18:53 +0000 UTC" firstStartedPulling="2026-04-28 19:18:54.196854792 +0000 UTC m=+175.869299878" lastFinishedPulling="2026-04-28 19:18:57.078052667 +0000 UTC m=+178.750497754" observedRunningTime="2026-04-28 19:18:57.586594631 +0000 UTC m=+179.259039737" watchObservedRunningTime="2026-04-28 19:18:57.591005889 +0000 UTC m=+179.263451018"
Apr 28 19:19:13.187426 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:19:13.187381 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp"
Apr 28 19:19:13.187426 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:19:13.187433 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp"
Apr 28 19:19:23.349370 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:19:23.349299 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" podUID="87198a59-22cd-4fa6-b26b-882202b3d33b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 28 19:19:33.192975 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:19:33.192934 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp"
Apr 28 19:19:33.197292 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:19:33.197263 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-d558fffc9-9lkpp"
Apr 28 19:19:33.349228 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:19:33.349157 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" podUID="87198a59-22cd-4fa6-b26b-882202b3d33b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 28 19:19:43.348846 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:19:43.348806 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" podUID="87198a59-22cd-4fa6-b26b-882202b3d33b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 28 19:19:43.349281 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:19:43.348888 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm"
Apr 28 19:19:43.349500 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:19:43.349465 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"a8e917ed1da592cf07b651c7a4b374bb2d4ae7b5a34f4fd762a89b6363173234"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 28 19:19:43.349557 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:19:43.349540 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" podUID="87198a59-22cd-4fa6-b26b-882202b3d33b" containerName="service-proxy" containerID="cri-o://a8e917ed1da592cf07b651c7a4b374bb2d4ae7b5a34f4fd762a89b6363173234" gracePeriod=30
Apr 28 19:19:43.667025 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:19:43.666917 2572 generic.go:358] "Generic (PLEG): container finished" podID="87198a59-22cd-4fa6-b26b-882202b3d33b" containerID="a8e917ed1da592cf07b651c7a4b374bb2d4ae7b5a34f4fd762a89b6363173234" exitCode=2
Apr 28 19:19:43.667025 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:19:43.666981 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" event={"ID":"87198a59-22cd-4fa6-b26b-882202b3d33b","Type":"ContainerDied","Data":"a8e917ed1da592cf07b651c7a4b374bb2d4ae7b5a34f4fd762a89b6363173234"}
Apr 28 19:19:43.667025 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:19:43.667020 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6d7d75c7df-bl2fm" event={"ID":"87198a59-22cd-4fa6-b26b-882202b3d33b","Type":"ContainerStarted","Data":"c048c796da5987b0c0f448ed86aa2b73963ea21d1921a9b6b2a02e452fa9e929"}
Apr 28 19:20:09.721676 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:09.721624 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs\") pod \"network-metrics-daemon-kbztx\" (UID: \"a341bf63-a680-4dba-8ba9-7f2a8180d537\") " pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:20:09.724108 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:09.724083 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a341bf63-a680-4dba-8ba9-7f2a8180d537-metrics-certs\") pod \"network-metrics-daemon-kbztx\" (UID: \"a341bf63-a680-4dba-8ba9-7f2a8180d537\") " pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:20:09.988678 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:09.988593 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-djdhf\""
Apr 28 19:20:09.996117 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:09.996095 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbztx"
Apr 28 19:20:10.111606 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:10.111572 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kbztx"]
Apr 28 19:20:10.114939 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:20:10.114907 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda341bf63_a680_4dba_8ba9_7f2a8180d537.slice/crio-7ccab018e334b229e0fe27a702bae8ac0fec797e06045c4fc4c509894fcf0610 WatchSource:0}: Error finding container 7ccab018e334b229e0fe27a702bae8ac0fec797e06045c4fc4c509894fcf0610: Status 404 returned error can't find the container with id 7ccab018e334b229e0fe27a702bae8ac0fec797e06045c4fc4c509894fcf0610
Apr 28 19:20:10.733428 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:10.733354 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kbztx" event={"ID":"a341bf63-a680-4dba-8ba9-7f2a8180d537","Type":"ContainerStarted","Data":"7ccab018e334b229e0fe27a702bae8ac0fec797e06045c4fc4c509894fcf0610"}
Apr 28 19:20:11.737193 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:11.737142 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kbztx" event={"ID":"a341bf63-a680-4dba-8ba9-7f2a8180d537","Type":"ContainerStarted","Data":"9340fbe5a3ca44d3a4a1ca4be7bca9f3104b6c7d22fb89530fe64aac3e67fb4b"}
Apr 28 19:20:11.737193 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:11.737194 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kbztx" event={"ID":"a341bf63-a680-4dba-8ba9-7f2a8180d537","Type":"ContainerStarted","Data":"07f7ce3ad8cff41e998c62c8c5208934f639ad1a2872af07afc5e4bc8b87c052"}
Apr 28 19:20:11.758000 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:11.757949 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kbztx" podStartSLOduration=251.850762593 podStartE2EDuration="4m12.757932s" podCreationTimestamp="2026-04-28 19:15:59 +0000 UTC" firstStartedPulling="2026-04-28 19:20:10.116780687 +0000 UTC m=+251.789225770" lastFinishedPulling="2026-04-28 19:20:11.023950092 +0000 UTC m=+252.696395177" observedRunningTime="2026-04-28 19:20:11.757476821 +0000 UTC m=+253.429921938" watchObservedRunningTime="2026-04-28 19:20:11.757932 +0000 UTC m=+253.430377104"
Apr 28 19:20:30.119243 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.119202 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-mdlds"]
Apr 28 19:20:30.122406 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.122389 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mdlds"
Apr 28 19:20:30.125036 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.124995 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 28 19:20:30.134659 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.134637 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mdlds"]
Apr 28 19:20:30.173496 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.173457 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/98dda8bf-ccac-4d10-a55e-8d1b2d3121b1-dbus\") pod \"global-pull-secret-syncer-mdlds\" (UID: \"98dda8bf-ccac-4d10-a55e-8d1b2d3121b1\") " pod="kube-system/global-pull-secret-syncer-mdlds"
Apr 28 19:20:30.173678 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.173520 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/98dda8bf-ccac-4d10-a55e-8d1b2d3121b1-original-pull-secret\") pod \"global-pull-secret-syncer-mdlds\" (UID: \"98dda8bf-ccac-4d10-a55e-8d1b2d3121b1\") " pod="kube-system/global-pull-secret-syncer-mdlds"
Apr 28 19:20:30.173678 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.173620 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/98dda8bf-ccac-4d10-a55e-8d1b2d3121b1-kubelet-config\") pod \"global-pull-secret-syncer-mdlds\" (UID: \"98dda8bf-ccac-4d10-a55e-8d1b2d3121b1\") " pod="kube-system/global-pull-secret-syncer-mdlds"
Apr 28 19:20:30.274084 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.274034 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/98dda8bf-ccac-4d10-a55e-8d1b2d3121b1-original-pull-secret\") pod \"global-pull-secret-syncer-mdlds\" (UID: \"98dda8bf-ccac-4d10-a55e-8d1b2d3121b1\") " pod="kube-system/global-pull-secret-syncer-mdlds"
Apr 28 19:20:30.274084 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.274091 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/98dda8bf-ccac-4d10-a55e-8d1b2d3121b1-kubelet-config\") pod \"global-pull-secret-syncer-mdlds\" (UID: \"98dda8bf-ccac-4d10-a55e-8d1b2d3121b1\") " pod="kube-system/global-pull-secret-syncer-mdlds"
Apr 28 19:20:30.274387 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.274133 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/98dda8bf-ccac-4d10-a55e-8d1b2d3121b1-dbus\") pod \"global-pull-secret-syncer-mdlds\" (UID: \"98dda8bf-ccac-4d10-a55e-8d1b2d3121b1\") " pod="kube-system/global-pull-secret-syncer-mdlds"
Apr 28 19:20:30.274387 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.274251 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/98dda8bf-ccac-4d10-a55e-8d1b2d3121b1-kubelet-config\") pod \"global-pull-secret-syncer-mdlds\" (UID: \"98dda8bf-ccac-4d10-a55e-8d1b2d3121b1\") " pod="kube-system/global-pull-secret-syncer-mdlds"
Apr 28 19:20:30.274387 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.274303 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/98dda8bf-ccac-4d10-a55e-8d1b2d3121b1-dbus\") pod \"global-pull-secret-syncer-mdlds\" (UID: \"98dda8bf-ccac-4d10-a55e-8d1b2d3121b1\") " pod="kube-system/global-pull-secret-syncer-mdlds"
Apr 28 19:20:30.276406 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.276384 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/98dda8bf-ccac-4d10-a55e-8d1b2d3121b1-original-pull-secret\") pod \"global-pull-secret-syncer-mdlds\" (UID: \"98dda8bf-ccac-4d10-a55e-8d1b2d3121b1\") " pod="kube-system/global-pull-secret-syncer-mdlds"
Apr 28 19:20:30.431829 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.431730 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mdlds"
Apr 28 19:20:30.566812 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.566771 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mdlds"]
Apr 28 19:20:30.567373 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:20:30.567347 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98dda8bf_ccac_4d10_a55e_8d1b2d3121b1.slice/crio-f920815ce5969af3c6837f20737507ec782e0587fbc70502f438160ae628a2af WatchSource:0}: Error finding container f920815ce5969af3c6837f20737507ec782e0587fbc70502f438160ae628a2af: Status 404 returned error can't find the container with id f920815ce5969af3c6837f20737507ec782e0587fbc70502f438160ae628a2af
Apr 28 19:20:30.790615 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:30.790570 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mdlds" event={"ID":"98dda8bf-ccac-4d10-a55e-8d1b2d3121b1","Type":"ContainerStarted","Data":"f920815ce5969af3c6837f20737507ec782e0587fbc70502f438160ae628a2af"}
Apr 28 19:20:35.809596 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:35.809556 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mdlds" event={"ID":"98dda8bf-ccac-4d10-a55e-8d1b2d3121b1","Type":"ContainerStarted","Data":"483e74dfaaf78632fbd22a42b07251fd5a5e99837921b8ce34a6070b9d8bb23e"}
Apr 28 19:20:35.826380 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:35.826331 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-mdlds" podStartSLOduration=0.997659383 podStartE2EDuration="5.826315073s" podCreationTimestamp="2026-04-28 19:20:30 +0000 UTC" firstStartedPulling="2026-04-28 19:20:30.569207655 +0000 UTC m=+272.241652736" lastFinishedPulling="2026-04-28 19:20:35.397863329 +0000 UTC m=+277.070308426" observedRunningTime="2026-04-28 19:20:35.825644936 +0000 UTC m=+277.498090053" watchObservedRunningTime="2026-04-28 19:20:35.826315073 +0000 UTC m=+277.498760207"
Apr 28 19:20:58.782871 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:20:58.782840 2572 kubelet.go:1628] "Image garbage collection succeeded"
Apr 28 19:22:24.264085 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.264051 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-kwt2h"]
Apr 28 19:22:24.267041 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.267024 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-kwt2h"
Apr 28 19:22:24.273095 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.273073 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 28 19:22:24.273796 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.273772 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vtdql\""
Apr 28 19:22:24.273898 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.273798 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 28 19:22:24.273898 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.273845 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 28 19:22:24.280765 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.280745 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-kwt2h"]
Apr 28 19:22:24.441194 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.441139 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fed37cca-a5e0-405d-bdee-c9bb70721562-data\") pod \"seaweedfs-86cc847c5c-kwt2h\" (UID: \"fed37cca-a5e0-405d-bdee-c9bb70721562\") " pod="kserve/seaweedfs-86cc847c5c-kwt2h"
Apr 28 19:22:24.441375 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.441233 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dprxz\" (UniqueName: \"kubernetes.io/projected/fed37cca-a5e0-405d-bdee-c9bb70721562-kube-api-access-dprxz\") pod \"seaweedfs-86cc847c5c-kwt2h\" (UID: \"fed37cca-a5e0-405d-bdee-c9bb70721562\") " pod="kserve/seaweedfs-86cc847c5c-kwt2h"
Apr 28 19:22:24.542550 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.542456 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fed37cca-a5e0-405d-bdee-c9bb70721562-data\") pod \"seaweedfs-86cc847c5c-kwt2h\" (UID: \"fed37cca-a5e0-405d-bdee-c9bb70721562\") " pod="kserve/seaweedfs-86cc847c5c-kwt2h"
Apr 28 19:22:24.542550 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.542533 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dprxz\" (UniqueName: \"kubernetes.io/projected/fed37cca-a5e0-405d-bdee-c9bb70721562-kube-api-access-dprxz\") pod \"seaweedfs-86cc847c5c-kwt2h\" (UID: \"fed37cca-a5e0-405d-bdee-c9bb70721562\") " pod="kserve/seaweedfs-86cc847c5c-kwt2h"
Apr 28 19:22:24.542841 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.542818 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fed37cca-a5e0-405d-bdee-c9bb70721562-data\") pod \"seaweedfs-86cc847c5c-kwt2h\" (UID: \"fed37cca-a5e0-405d-bdee-c9bb70721562\") " pod="kserve/seaweedfs-86cc847c5c-kwt2h"
Apr 28 19:22:24.551274 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.551250 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dprxz\" (UniqueName: \"kubernetes.io/projected/fed37cca-a5e0-405d-bdee-c9bb70721562-kube-api-access-dprxz\") pod \"seaweedfs-86cc847c5c-kwt2h\" (UID: \"fed37cca-a5e0-405d-bdee-c9bb70721562\") " pod="kserve/seaweedfs-86cc847c5c-kwt2h"
Apr 28 19:22:24.577081 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.577050 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-kwt2h"
Apr 28 19:22:24.725513 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.725479 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-kwt2h"]
Apr 28 19:22:24.728648 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:22:24.728619 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfed37cca_a5e0_405d_bdee_c9bb70721562.slice/crio-4e6f8329ce751ab9c606a5d9e6d68ad1298bd80cb85d554fc0d273ccd1a47216 WatchSource:0}: Error finding container 4e6f8329ce751ab9c606a5d9e6d68ad1298bd80cb85d554fc0d273ccd1a47216: Status 404 returned error can't find the container with id 4e6f8329ce751ab9c606a5d9e6d68ad1298bd80cb85d554fc0d273ccd1a47216
Apr 28 19:22:24.729885 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:24.729866 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 19:22:25.092910 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:25.092870 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-kwt2h" event={"ID":"fed37cca-a5e0-405d-bdee-c9bb70721562","Type":"ContainerStarted","Data":"4e6f8329ce751ab9c606a5d9e6d68ad1298bd80cb85d554fc0d273ccd1a47216"}
Apr 28 19:22:28.104690 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:28.104649 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-kwt2h" event={"ID":"fed37cca-a5e0-405d-bdee-c9bb70721562","Type":"ContainerStarted","Data":"e0aa3dbae8c3a3578fc8e5f98f342cf4d40e283d7ba806ac63e36bc25344672b"}
Apr 28 19:22:28.105214 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:28.104808 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-kwt2h"
Apr 28 19:22:34.109899 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:34.109866 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-kwt2h"
Apr 28 19:22:34.128398 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:22:34.128345 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-kwt2h" podStartSLOduration=7.467631462 podStartE2EDuration="10.128328817s" podCreationTimestamp="2026-04-28 19:22:24 +0000 UTC" firstStartedPulling="2026-04-28 19:22:24.729985614 +0000 UTC m=+386.402430696" lastFinishedPulling="2026-04-28 19:22:27.390682954 +0000 UTC m=+389.063128051" observedRunningTime="2026-04-28 19:22:28.131942546 +0000 UTC m=+389.804387651" watchObservedRunningTime="2026-04-28 19:22:34.128328817 +0000 UTC m=+395.800773921"
Apr 28 19:23:34.985222 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:34.985191 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-mx5cm"]
Apr 28 19:23:34.988168 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:34.988140 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-mx5cm"
Apr 28 19:23:34.991800 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:34.991780 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-xrr88\""
Apr 28 19:23:34.992733 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:34.992718 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 28 19:23:35.001730 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.001562 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-gcsqh"]
Apr 28 19:23:35.004947 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.004923 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-mx5cm"]
Apr 28 19:23:35.005072 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.005044 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-gcsqh"
Apr 28 19:23:35.007799 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.007783 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-x4dzs\""
Apr 28 19:23:35.008200 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.008166 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 28 19:23:35.016072 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.016051 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-gcsqh"]
Apr 28 19:23:35.050644 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.050610 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce-cert\") pod \"odh-model-controller-696fc77849-gcsqh\" (UID: \"dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce\") " pod="kserve/odh-model-controller-696fc77849-gcsqh"
Apr 28 19:23:35.050803 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.050658 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqbdj\" (UniqueName: \"kubernetes.io/projected/15442544-6447-448d-9c8d-67ac00ac5bc8-kube-api-access-wqbdj\") pod \"model-serving-api-86f7b4b499-mx5cm\" (UID: \"15442544-6447-448d-9c8d-67ac00ac5bc8\") " pod="kserve/model-serving-api-86f7b4b499-mx5cm"
Apr 28 19:23:35.050803 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.050728 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfwmf\" (UniqueName: \"kubernetes.io/projected/dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce-kube-api-access-dfwmf\") pod \"odh-model-controller-696fc77849-gcsqh\" (UID: \"dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce\") " pod="kserve/odh-model-controller-696fc77849-gcsqh"
Apr 28 19:23:35.050803 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.050763 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15442544-6447-448d-9c8d-67ac00ac5bc8-tls-certs\") pod \"model-serving-api-86f7b4b499-mx5cm\" (UID: \"15442544-6447-448d-9c8d-67ac00ac5bc8\") " pod="kserve/model-serving-api-86f7b4b499-mx5cm"
Apr 28 19:23:35.152102 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.152064 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfwmf\" (UniqueName: \"kubernetes.io/projected/dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce-kube-api-access-dfwmf\") pod \"odh-model-controller-696fc77849-gcsqh\" (UID: \"dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce\") " pod="kserve/odh-model-controller-696fc77849-gcsqh"
Apr 28 19:23:35.152102 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.152104 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15442544-6447-448d-9c8d-67ac00ac5bc8-tls-certs\") pod \"model-serving-api-86f7b4b499-mx5cm\" (UID: \"15442544-6447-448d-9c8d-67ac00ac5bc8\") " pod="kserve/model-serving-api-86f7b4b499-mx5cm"
Apr 28 19:23:35.152317 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.152297 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce-cert\") pod \"odh-model-controller-696fc77849-gcsqh\" (UID: \"dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce\") " pod="kserve/odh-model-controller-696fc77849-gcsqh"
Apr 28 19:23:35.152387 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.152370 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqbdj\" (UniqueName: \"kubernetes.io/projected/15442544-6447-448d-9c8d-67ac00ac5bc8-kube-api-access-wqbdj\") pod \"model-serving-api-86f7b4b499-mx5cm\" (UID: \"15442544-6447-448d-9c8d-67ac00ac5bc8\") " pod="kserve/model-serving-api-86f7b4b499-mx5cm"
Apr 28 19:23:35.152443 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:23:35.152426 2572 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 28 19:23:35.152506 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:23:35.152497 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce-cert podName:dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce nodeName:}" failed. No retries permitted until 2026-04-28 19:23:35.652479793 +0000 UTC m=+457.324924886 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce-cert") pod "odh-model-controller-696fc77849-gcsqh" (UID: "dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce") : secret "odh-model-controller-webhook-cert" not found
Apr 28 19:23:35.154569 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.154538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15442544-6447-448d-9c8d-67ac00ac5bc8-tls-certs\") pod \"model-serving-api-86f7b4b499-mx5cm\" (UID: \"15442544-6447-448d-9c8d-67ac00ac5bc8\") " pod="kserve/model-serving-api-86f7b4b499-mx5cm"
Apr 28 19:23:35.163945 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.163925 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqbdj\" (UniqueName: \"kubernetes.io/projected/15442544-6447-448d-9c8d-67ac00ac5bc8-kube-api-access-wqbdj\") pod \"model-serving-api-86f7b4b499-mx5cm\" (UID: \"15442544-6447-448d-9c8d-67ac00ac5bc8\") " pod="kserve/model-serving-api-86f7b4b499-mx5cm"
Apr 28 19:23:35.164525 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.164501 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfwmf\" (UniqueName: \"kubernetes.io/projected/dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce-kube-api-access-dfwmf\") pod \"odh-model-controller-696fc77849-gcsqh\" (UID: \"dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce\") " pod="kserve/odh-model-controller-696fc77849-gcsqh"
Apr 28 19:23:35.298230 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.298134 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-mx5cm"
Apr 28 19:23:35.419908 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.419881 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-mx5cm"]
Apr 28 19:23:35.422671 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:23:35.422650 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15442544_6447_448d_9c8d_67ac00ac5bc8.slice/crio-431de1faf45f5926b98cf13c6086cd8d4a93bf507d4bc6c350793434d5c7b110 WatchSource:0}: Error finding container 431de1faf45f5926b98cf13c6086cd8d4a93bf507d4bc6c350793434d5c7b110: Status 404 returned error can't find the container with id 431de1faf45f5926b98cf13c6086cd8d4a93bf507d4bc6c350793434d5c7b110
Apr 28 19:23:35.657503 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.657417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce-cert\") pod \"odh-model-controller-696fc77849-gcsqh\" (UID: \"dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce\") " pod="kserve/odh-model-controller-696fc77849-gcsqh"
Apr 28 19:23:35.659733 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.659702 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce-cert\") pod \"odh-model-controller-696fc77849-gcsqh\" (UID: \"dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce\") " pod="kserve/odh-model-controller-696fc77849-gcsqh"
Apr 28 19:23:35.914261 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:35.914168 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-gcsqh"
Apr 28 19:23:36.051695 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:36.051637 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-gcsqh"]
Apr 28 19:23:36.055728 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:23:36.055693 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc1e6ebe_07bc_4bb8_be03_ec0b5719f1ce.slice/crio-968dca981235da765845365200e7ec172dd21ceed12b7090e646666d9bdea911 WatchSource:0}: Error finding container 968dca981235da765845365200e7ec172dd21ceed12b7090e646666d9bdea911: Status 404 returned error can't find the container with id 968dca981235da765845365200e7ec172dd21ceed12b7090e646666d9bdea911
Apr 28 19:23:36.281844 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:36.281792 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-gcsqh" event={"ID":"dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce","Type":"ContainerStarted","Data":"968dca981235da765845365200e7ec172dd21ceed12b7090e646666d9bdea911"}
Apr 28 19:23:36.283080 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:36.283047 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-mx5cm" event={"ID":"15442544-6447-448d-9c8d-67ac00ac5bc8","Type":"ContainerStarted","Data":"431de1faf45f5926b98cf13c6086cd8d4a93bf507d4bc6c350793434d5c7b110"}
Apr 28 19:23:39.292732 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:39.292696 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-gcsqh" event={"ID":"dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce","Type":"ContainerStarted","Data":"47e4cf3846da516daea191a8ab1619da583673394007287684d6bc50cfe83606"}
Apr 28 19:23:39.293195 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:39.292800 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-gcsqh"
Apr 28 19:23:39.293956 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:39.293935 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-mx5cm" event={"ID":"15442544-6447-448d-9c8d-67ac00ac5bc8","Type":"ContainerStarted","Data":"f996a6800f7ce8da60bc07785eaecfc1a286984c31b05d5a226062080a5a0a7f"}
Apr 28 19:23:39.294052 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:39.294032 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-mx5cm"
Apr 28 19:23:39.310475 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:39.310387 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-gcsqh" podStartSLOduration=2.273452564 podStartE2EDuration="5.310372915s" podCreationTimestamp="2026-04-28 19:23:34 +0000 UTC" firstStartedPulling="2026-04-28 19:23:36.057285099 +0000 UTC m=+457.729730187" lastFinishedPulling="2026-04-28 19:23:39.094205454 +0000 UTC m=+460.766650538" observedRunningTime="2026-04-28 19:23:39.309644383 +0000 UTC m=+460.982089499" watchObservedRunningTime="2026-04-28 19:23:39.310372915 +0000 UTC m=+460.982818022"
Apr 28 19:23:39.330572 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:39.330524 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-mx5cm" podStartSLOduration=1.706833952 podStartE2EDuration="5.33051121s" podCreationTimestamp="2026-04-28 19:23:34 +0000 UTC" firstStartedPulling="2026-04-28 19:23:35.424452148 +0000 UTC m=+457.096897229" lastFinishedPulling="2026-04-28 19:23:39.048129393 +0000 UTC m=+460.720574487" observedRunningTime="2026-04-28 19:23:39.330252663 +0000 UTC m=+461.002697763" watchObservedRunningTime="2026-04-28 19:23:39.33051121 +0000 UTC m=+461.002956314"
Apr 28 19:23:50.299725 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:50.299691 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-gcsqh"
Apr 28 19:23:50.301534 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:23:50.301517 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-mx5cm"
Apr 28 19:24:10.619083 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.619049 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn"]
Apr 28 19:24:10.625475 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.625455 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn"
Apr 28 19:24:10.628625 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.628603 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5h9ft\""
Apr 28 19:24:10.628747 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.628637 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 28 19:24:10.628747 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.628607 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-199b4-kube-rbac-proxy-sar-config\""
Apr 28 19:24:10.628747 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.628698 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 28 19:24:10.629868 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.629850 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-199b4-predictor-serving-cert\""
Apr 28 19:24:10.642924 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.642895 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn"]
Apr 28 19:24:10.739500 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.739464 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-raw-sklearn-batcher-199b4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f6149254-214c-471c-8ec1-826372d2bb1d-isvc-raw-sklearn-batcher-199b4-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn"
Apr 28 19:24:10.739500 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.739502 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-255q4\" (UniqueName: \"kubernetes.io/projected/f6149254-214c-471c-8ec1-826372d2bb1d-kube-api-access-255q4\") pod \"isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn"
Apr 28 19:24:10.739705 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.739531 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6149254-214c-471c-8ec1-826372d2bb1d-proxy-tls\") pod \"isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn"
Apr 28 19:24:10.739705 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.739575 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6149254-214c-471c-8ec1-826372d2bb1d-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn"
Apr 28 19:24:10.840981 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.840948 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6149254-214c-471c-8ec1-826372d2bb1d-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn"
Apr 28 19:24:10.841136 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.841050 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-raw-sklearn-batcher-199b4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f6149254-214c-471c-8ec1-826372d2bb1d-isvc-raw-sklearn-batcher-199b4-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn"
Apr 28 19:24:10.841136 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.841078 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-255q4\" (UniqueName: \"kubernetes.io/projected/f6149254-214c-471c-8ec1-826372d2bb1d-kube-api-access-255q4\") pod \"isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") "
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:24:10.841136 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.841110 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6149254-214c-471c-8ec1-826372d2bb1d-proxy-tls\") pod \"isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:24:10.841325 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.841236 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6149254-214c-471c-8ec1-826372d2bb1d-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:24:10.841684 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.841665 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-raw-sklearn-batcher-199b4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f6149254-214c-471c-8ec1-826372d2bb1d-isvc-raw-sklearn-batcher-199b4-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:24:10.843510 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.843486 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6149254-214c-471c-8ec1-826372d2bb1d-proxy-tls\") pod \"isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") " 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:24:10.851673 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.851652 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-255q4\" (UniqueName: \"kubernetes.io/projected/f6149254-214c-471c-8ec1-826372d2bb1d-kube-api-access-255q4\") pod \"isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:24:10.935965 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:10.935895 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:24:11.081478 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:11.081455 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn"] Apr 28 19:24:11.083463 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:24:11.083430 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6149254_214c_471c_8ec1_826372d2bb1d.slice/crio-7b3eb6a1eb478a6ec1d319e6fc23ce2ba6603c7664b0a0ef7f5967c1ac63d14b WatchSource:0}: Error finding container 7b3eb6a1eb478a6ec1d319e6fc23ce2ba6603c7664b0a0ef7f5967c1ac63d14b: Status 404 returned error can't find the container with id 7b3eb6a1eb478a6ec1d319e6fc23ce2ba6603c7664b0a0ef7f5967c1ac63d14b Apr 28 19:24:11.380482 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:11.380447 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" event={"ID":"f6149254-214c-471c-8ec1-826372d2bb1d","Type":"ContainerStarted","Data":"7b3eb6a1eb478a6ec1d319e6fc23ce2ba6603c7664b0a0ef7f5967c1ac63d14b"} Apr 28 19:24:15.394574 ip-10-0-134-36 
kubenswrapper[2572]: I0428 19:24:15.394540 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" event={"ID":"f6149254-214c-471c-8ec1-826372d2bb1d","Type":"ContainerStarted","Data":"582b217662a21d502da0e75dc8f0af89816fc296516e9c24bee42d23e656ed16"} Apr 28 19:24:18.403758 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:18.403720 2572 generic.go:358] "Generic (PLEG): container finished" podID="f6149254-214c-471c-8ec1-826372d2bb1d" containerID="582b217662a21d502da0e75dc8f0af89816fc296516e9c24bee42d23e656ed16" exitCode=0 Apr 28 19:24:18.404122 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:18.403797 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" event={"ID":"f6149254-214c-471c-8ec1-826372d2bb1d","Type":"ContainerDied","Data":"582b217662a21d502da0e75dc8f0af89816fc296516e9c24bee42d23e656ed16"} Apr 28 19:24:32.454596 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:32.454550 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" event={"ID":"f6149254-214c-471c-8ec1-826372d2bb1d","Type":"ContainerStarted","Data":"373ece8a05e79fe74181a8c8c18dd0cc78cbe0e2a599b086dbeaba18650ddcaa"} Apr 28 19:24:34.462758 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:34.462723 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" event={"ID":"f6149254-214c-471c-8ec1-826372d2bb1d","Type":"ContainerStarted","Data":"49b6c4318f1b4dc58a31b2766f76d5dfaf422be7b636389136d2799126143554"} Apr 28 19:24:36.470712 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:36.470673 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" 
event={"ID":"f6149254-214c-471c-8ec1-826372d2bb1d","Type":"ContainerStarted","Data":"d46e87290243a5367960d88165a070c43ee67781d9dcb61d32cfcfeae8aa603c"} Apr 28 19:24:36.471092 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:36.470909 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:24:36.493766 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:36.493691 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podStartSLOduration=1.186611477 podStartE2EDuration="26.493675943s" podCreationTimestamp="2026-04-28 19:24:10 +0000 UTC" firstStartedPulling="2026-04-28 19:24:11.085399483 +0000 UTC m=+492.757844565" lastFinishedPulling="2026-04-28 19:24:36.392463932 +0000 UTC m=+518.064909031" observedRunningTime="2026-04-28 19:24:36.491751293 +0000 UTC m=+518.164196398" watchObservedRunningTime="2026-04-28 19:24:36.493675943 +0000 UTC m=+518.166121046" Apr 28 19:24:37.473333 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:37.473292 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:24:37.473709 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:37.473343 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:24:37.474841 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:37.474807 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 28 19:24:37.475523 ip-10-0-134-36 
kubenswrapper[2572]: I0428 19:24:37.475489 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:24:38.476146 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:38.476100 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 28 19:24:38.476560 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:38.476520 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:24:43.481034 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:43.481004 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:24:43.481643 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:43.481607 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 28 19:24:43.482025 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:43.482004 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:24:53.481497 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:53.481459 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 28 19:24:53.481916 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:24:53.481873 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:25:03.481501 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:03.481457 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 28 19:25:03.481943 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:03.481922 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:25:13.481666 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:13.481618 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 28 19:25:13.482130 ip-10-0-134-36 
kubenswrapper[2572]: I0428 19:25:13.482025 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:25:23.481983 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:23.481939 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 28 19:25:23.482497 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:23.482445 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:25:33.482034 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:33.481996 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 28 19:25:33.482457 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:33.482427 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:25:43.482350 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:43.482312 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:25:43.482984 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:43.482960 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:25:55.676898 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:55.676859 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn"] Apr 28 19:25:55.677454 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:55.677253 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kserve-container" containerID="cri-o://373ece8a05e79fe74181a8c8c18dd0cc78cbe0e2a599b086dbeaba18650ddcaa" gracePeriod=30 Apr 28 19:25:55.677454 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:55.677274 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kube-rbac-proxy" containerID="cri-o://49b6c4318f1b4dc58a31b2766f76d5dfaf422be7b636389136d2799126143554" gracePeriod=30 Apr 28 19:25:55.677454 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:55.677281 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="agent" containerID="cri-o://d46e87290243a5367960d88165a070c43ee67781d9dcb61d32cfcfeae8aa603c" gracePeriod=30 Apr 28 19:25:56.483346 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.483313 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf"] Apr 28 19:25:56.486601 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.486580 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:25:56.490566 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.490536 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-357ae-kube-rbac-proxy-sar-config\"" Apr 28 19:25:56.490687 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.490625 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-357ae-predictor-serving-cert\"" Apr 28 19:25:56.503231 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.503198 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf"] Apr 28 19:25:56.587538 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.587493 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-357ae-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da2c2459-c227-459c-ae3e-9783f5fa460b-isvc-sklearn-graph-raw-357ae-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf\" (UID: \"da2c2459-c227-459c-ae3e-9783f5fa460b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:25:56.587734 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.587546 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da2c2459-c227-459c-ae3e-9783f5fa460b-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf\" (UID: 
\"da2c2459-c227-459c-ae3e-9783f5fa460b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:25:56.587734 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.587592 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da2c2459-c227-459c-ae3e-9783f5fa460b-proxy-tls\") pod \"isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf\" (UID: \"da2c2459-c227-459c-ae3e-9783f5fa460b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:25:56.587734 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.587631 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtffl\" (UniqueName: \"kubernetes.io/projected/da2c2459-c227-459c-ae3e-9783f5fa460b-kube-api-access-vtffl\") pod \"isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf\" (UID: \"da2c2459-c227-459c-ae3e-9783f5fa460b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:25:56.688353 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.688317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da2c2459-c227-459c-ae3e-9783f5fa460b-proxy-tls\") pod \"isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf\" (UID: \"da2c2459-c227-459c-ae3e-9783f5fa460b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:25:56.688849 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.688376 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtffl\" (UniqueName: \"kubernetes.io/projected/da2c2459-c227-459c-ae3e-9783f5fa460b-kube-api-access-vtffl\") pod \"isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf\" (UID: \"da2c2459-c227-459c-ae3e-9783f5fa460b\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:25:56.688849 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.688417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-357ae-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da2c2459-c227-459c-ae3e-9783f5fa460b-isvc-sklearn-graph-raw-357ae-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf\" (UID: \"da2c2459-c227-459c-ae3e-9783f5fa460b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:25:56.688849 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.688439 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da2c2459-c227-459c-ae3e-9783f5fa460b-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf\" (UID: \"da2c2459-c227-459c-ae3e-9783f5fa460b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:25:56.689028 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.688983 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da2c2459-c227-459c-ae3e-9783f5fa460b-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf\" (UID: \"da2c2459-c227-459c-ae3e-9783f5fa460b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:25:56.689192 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.689149 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-357ae-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da2c2459-c227-459c-ae3e-9783f5fa460b-isvc-sklearn-graph-raw-357ae-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf\" (UID: \"da2c2459-c227-459c-ae3e-9783f5fa460b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:25:56.690921 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.690893 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da2c2459-c227-459c-ae3e-9783f5fa460b-proxy-tls\") pod \"isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf\" (UID: \"da2c2459-c227-459c-ae3e-9783f5fa460b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:25:56.695586 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.695555 2572 generic.go:358] "Generic (PLEG): container finished" podID="f6149254-214c-471c-8ec1-826372d2bb1d" containerID="49b6c4318f1b4dc58a31b2766f76d5dfaf422be7b636389136d2799126143554" exitCode=2 Apr 28 19:25:56.695723 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.695635 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" event={"ID":"f6149254-214c-471c-8ec1-826372d2bb1d","Type":"ContainerDied","Data":"49b6c4318f1b4dc58a31b2766f76d5dfaf422be7b636389136d2799126143554"} Apr 28 19:25:56.697360 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.697335 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtffl\" (UniqueName: \"kubernetes.io/projected/da2c2459-c227-459c-ae3e-9783f5fa460b-kube-api-access-vtffl\") pod \"isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf\" (UID: \"da2c2459-c227-459c-ae3e-9783f5fa460b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:25:56.796688 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.796583 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:25:56.934041 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:56.934008 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf"] Apr 28 19:25:56.936631 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:25:56.936601 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda2c2459_c227_459c_ae3e_9783f5fa460b.slice/crio-e454eadc4dd3a20ae1769a6dedb267d4d2fa3702ec3792929d82fead87f43ebe WatchSource:0}: Error finding container e454eadc4dd3a20ae1769a6dedb267d4d2fa3702ec3792929d82fead87f43ebe: Status 404 returned error can't find the container with id e454eadc4dd3a20ae1769a6dedb267d4d2fa3702ec3792929d82fead87f43ebe Apr 28 19:25:57.035640 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.035606 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj"] Apr 28 19:25:57.039001 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.038980 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:25:57.042380 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.042350 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-357ae-predictor-serving-cert\"" Apr 28 19:25:57.042380 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.042378 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-357ae-kube-rbac-proxy-sar-config\"" Apr 28 19:25:57.055143 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.055063 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj"] Apr 28 19:25:57.093851 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.093814 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c348e295-37a9-4e1d-a806-e255f953dbbe-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:25:57.094037 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.093876 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c348e295-37a9-4e1d-a806-e255f953dbbe-proxy-tls\") pod \"isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:25:57.094037 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.093960 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-xgboost-graph-raw-357ae-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c348e295-37a9-4e1d-a806-e255f953dbbe-isvc-xgboost-graph-raw-357ae-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:25:57.094153 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.094078 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlt2x\" (UniqueName: \"kubernetes.io/projected/c348e295-37a9-4e1d-a806-e255f953dbbe-kube-api-access-mlt2x\") pod \"isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:25:57.194946 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.194896 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlt2x\" (UniqueName: \"kubernetes.io/projected/c348e295-37a9-4e1d-a806-e255f953dbbe-kube-api-access-mlt2x\") pod \"isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:25:57.195135 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.194984 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c348e295-37a9-4e1d-a806-e255f953dbbe-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:25:57.195135 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.195029 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c348e295-37a9-4e1d-a806-e255f953dbbe-proxy-tls\") pod \"isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:25:57.195135 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.195064 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-357ae-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c348e295-37a9-4e1d-a806-e255f953dbbe-isvc-xgboost-graph-raw-357ae-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:25:57.195443 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.195422 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c348e295-37a9-4e1d-a806-e255f953dbbe-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:25:57.195887 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.195849 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-357ae-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c348e295-37a9-4e1d-a806-e255f953dbbe-isvc-xgboost-graph-raw-357ae-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:25:57.197796 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.197769 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c348e295-37a9-4e1d-a806-e255f953dbbe-proxy-tls\") pod \"isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:25:57.203190 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.203141 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlt2x\" (UniqueName: \"kubernetes.io/projected/c348e295-37a9-4e1d-a806-e255f953dbbe-kube-api-access-mlt2x\") pod \"isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:25:57.350248 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.350129 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:25:57.480098 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.480065 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj"] Apr 28 19:25:57.483503 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:25:57.483470 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc348e295_37a9_4e1d_a806_e255f953dbbe.slice/crio-8bb13f8b9f6e6d951e5884b9c023102612fec0045bb27e6391ec81e1d337bef2 WatchSource:0}: Error finding container 8bb13f8b9f6e6d951e5884b9c023102612fec0045bb27e6391ec81e1d337bef2: Status 404 returned error can't find the container with id 8bb13f8b9f6e6d951e5884b9c023102612fec0045bb27e6391ec81e1d337bef2 Apr 28 19:25:57.700250 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.700210 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" event={"ID":"da2c2459-c227-459c-ae3e-9783f5fa460b","Type":"ContainerStarted","Data":"e519b5faa07ed45c094b9d609b81761e13acfb900fa7e749e31397f67245bffd"} Apr 28 19:25:57.700250 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.700249 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" event={"ID":"da2c2459-c227-459c-ae3e-9783f5fa460b","Type":"ContainerStarted","Data":"e454eadc4dd3a20ae1769a6dedb267d4d2fa3702ec3792929d82fead87f43ebe"} Apr 28 19:25:57.701703 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.701672 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" event={"ID":"c348e295-37a9-4e1d-a806-e255f953dbbe","Type":"ContainerStarted","Data":"6475de39a224cea235055988be721e9119717c60b872719dd1dc368311287638"} Apr 28 19:25:57.701703 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:57.701706 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" event={"ID":"c348e295-37a9-4e1d-a806-e255f953dbbe","Type":"ContainerStarted","Data":"8bb13f8b9f6e6d951e5884b9c023102612fec0045bb27e6391ec81e1d337bef2"} Apr 28 19:25:58.476438 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:25:58.476398 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.17:8643/healthz\": dial tcp 10.134.0.17:8643: connect: connection refused" Apr 28 19:26:00.714108 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:00.714068 2572 generic.go:358] "Generic (PLEG): container finished" podID="f6149254-214c-471c-8ec1-826372d2bb1d" 
containerID="373ece8a05e79fe74181a8c8c18dd0cc78cbe0e2a599b086dbeaba18650ddcaa" exitCode=0 Apr 28 19:26:00.714498 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:00.714121 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" event={"ID":"f6149254-214c-471c-8ec1-826372d2bb1d","Type":"ContainerDied","Data":"373ece8a05e79fe74181a8c8c18dd0cc78cbe0e2a599b086dbeaba18650ddcaa"} Apr 28 19:26:01.718273 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:01.718237 2572 generic.go:358] "Generic (PLEG): container finished" podID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerID="e519b5faa07ed45c094b9d609b81761e13acfb900fa7e749e31397f67245bffd" exitCode=0 Apr 28 19:26:01.718727 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:01.718314 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" event={"ID":"da2c2459-c227-459c-ae3e-9783f5fa460b","Type":"ContainerDied","Data":"e519b5faa07ed45c094b9d609b81761e13acfb900fa7e749e31397f67245bffd"} Apr 28 19:26:01.719596 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:01.719572 2572 generic.go:358] "Generic (PLEG): container finished" podID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerID="6475de39a224cea235055988be721e9119717c60b872719dd1dc368311287638" exitCode=0 Apr 28 19:26:01.719671 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:01.719642 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" event={"ID":"c348e295-37a9-4e1d-a806-e255f953dbbe","Type":"ContainerDied","Data":"6475de39a224cea235055988be721e9119717c60b872719dd1dc368311287638"} Apr 28 19:26:02.725889 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:02.725851 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" 
event={"ID":"da2c2459-c227-459c-ae3e-9783f5fa460b","Type":"ContainerStarted","Data":"23fc031193f5d923b5313afa5049c1a888b6c61bdf60bb958b7980ca43282702"} Apr 28 19:26:02.726320 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:02.725899 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" event={"ID":"da2c2459-c227-459c-ae3e-9783f5fa460b","Type":"ContainerStarted","Data":"336b3339134acde5538a5344f35e287083fe34b371386af5704b9ca3ccc129cf"} Apr 28 19:26:02.726379 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:02.726325 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:26:02.726379 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:02.726358 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:26:02.727789 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:02.727755 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 28 19:26:02.752595 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:02.752411 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" podStartSLOduration=6.752390308 podStartE2EDuration="6.752390308s" podCreationTimestamp="2026-04-28 19:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:26:02.750193977 +0000 UTC m=+604.422639080" watchObservedRunningTime="2026-04-28 19:26:02.752390308 +0000 UTC 
m=+604.424835413" Apr 28 19:26:03.477379 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:03.477331 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.17:8643/healthz\": dial tcp 10.134.0.17:8643: connect: connection refused" Apr 28 19:26:03.482338 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:03.482306 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 28 19:26:03.482724 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:03.482688 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:26:03.731455 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:03.731359 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 28 19:26:08.477026 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:08.476980 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.17:8643/healthz\": dial tcp 10.134.0.17:8643: connect: connection 
refused" Apr 28 19:26:08.477544 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:08.477113 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:26:08.736078 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:08.735989 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:26:08.736715 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:08.736679 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 28 19:26:13.476911 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:13.476822 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.17:8643/healthz\": dial tcp 10.134.0.17:8643: connect: connection refused" Apr 28 19:26:13.482450 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:13.482400 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 28 19:26:13.482903 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:13.482868 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="agent" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 28 19:26:18.476995 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:18.476938 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.17:8643/healthz\": dial tcp 10.134.0.17:8643: connect: connection refused" Apr 28 19:26:18.737587 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:18.737485 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 28 19:26:21.788295 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:21.788263 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" event={"ID":"c348e295-37a9-4e1d-a806-e255f953dbbe","Type":"ContainerStarted","Data":"ef6a5bdd437877d61bd856af0f18f7bec9b27bf3cfaed9b2da99fbd114ce1275"} Apr 28 19:26:21.788671 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:21.788303 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" event={"ID":"c348e295-37a9-4e1d-a806-e255f953dbbe","Type":"ContainerStarted","Data":"96763129595c614e3f8783552e78303e47859043977de4fd0ab7940a1195a5ae"} Apr 28 19:26:21.788671 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:21.788501 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:26:21.811880 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:21.811825 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" podStartSLOduration=5.575951723 podStartE2EDuration="24.811810029s" podCreationTimestamp="2026-04-28 19:25:57 +0000 UTC" firstStartedPulling="2026-04-28 19:26:01.720751043 +0000 UTC m=+603.393196125" lastFinishedPulling="2026-04-28 19:26:20.956609348 +0000 UTC m=+622.629054431" observedRunningTime="2026-04-28 19:26:21.810214337 +0000 UTC m=+623.482659438" watchObservedRunningTime="2026-04-28 19:26:21.811810029 +0000 UTC m=+623.484255133" Apr 28 19:26:22.791299 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:22.791268 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:26:22.792563 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:22.792538 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 28 19:26:23.477000 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:23.476944 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.17:8643/healthz\": dial tcp 10.134.0.17:8643: connect: connection refused" Apr 28 19:26:23.482416 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:23.482382 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 28 19:26:23.482571 ip-10-0-134-36 
kubenswrapper[2572]: I0428 19:26:23.482534 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:26:23.482741 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:23.482715 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:26:23.482849 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:23.482834 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:26:23.794349 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:23.794247 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 28 19:26:25.801135 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:25.801103 2572 generic.go:358] "Generic (PLEG): container finished" podID="f6149254-214c-471c-8ec1-826372d2bb1d" containerID="d46e87290243a5367960d88165a070c43ee67781d9dcb61d32cfcfeae8aa603c" exitCode=0 Apr 28 19:26:25.801513 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:25.801201 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" event={"ID":"f6149254-214c-471c-8ec1-826372d2bb1d","Type":"ContainerDied","Data":"d46e87290243a5367960d88165a070c43ee67781d9dcb61d32cfcfeae8aa603c"} Apr 28 19:26:26.325275 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.325248 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:26:26.348787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.348758 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6149254-214c-471c-8ec1-826372d2bb1d-proxy-tls\") pod \"f6149254-214c-471c-8ec1-826372d2bb1d\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") " Apr 28 19:26:26.348910 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.348796 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6149254-214c-471c-8ec1-826372d2bb1d-kserve-provision-location\") pod \"f6149254-214c-471c-8ec1-826372d2bb1d\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") " Apr 28 19:26:26.348910 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.348832 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-raw-sklearn-batcher-199b4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f6149254-214c-471c-8ec1-826372d2bb1d-isvc-raw-sklearn-batcher-199b4-kube-rbac-proxy-sar-config\") pod \"f6149254-214c-471c-8ec1-826372d2bb1d\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") " Apr 28 19:26:26.348910 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.348853 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-255q4\" (UniqueName: \"kubernetes.io/projected/f6149254-214c-471c-8ec1-826372d2bb1d-kube-api-access-255q4\") pod \"f6149254-214c-471c-8ec1-826372d2bb1d\" (UID: \"f6149254-214c-471c-8ec1-826372d2bb1d\") " Apr 28 19:26:26.349253 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.349218 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6149254-214c-471c-8ec1-826372d2bb1d-isvc-raw-sklearn-batcher-199b4-kube-rbac-proxy-sar-config" 
(OuterVolumeSpecName: "isvc-raw-sklearn-batcher-199b4-kube-rbac-proxy-sar-config") pod "f6149254-214c-471c-8ec1-826372d2bb1d" (UID: "f6149254-214c-471c-8ec1-826372d2bb1d"). InnerVolumeSpecName "isvc-raw-sklearn-batcher-199b4-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:26:26.349365 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.349226 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6149254-214c-471c-8ec1-826372d2bb1d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f6149254-214c-471c-8ec1-826372d2bb1d" (UID: "f6149254-214c-471c-8ec1-826372d2bb1d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:26:26.351388 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.351362 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6149254-214c-471c-8ec1-826372d2bb1d-kube-api-access-255q4" (OuterVolumeSpecName: "kube-api-access-255q4") pod "f6149254-214c-471c-8ec1-826372d2bb1d" (UID: "f6149254-214c-471c-8ec1-826372d2bb1d"). InnerVolumeSpecName "kube-api-access-255q4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:26:26.351502 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.351438 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6149254-214c-471c-8ec1-826372d2bb1d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f6149254-214c-471c-8ec1-826372d2bb1d" (UID: "f6149254-214c-471c-8ec1-826372d2bb1d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:26:26.449795 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.449752 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f6149254-214c-471c-8ec1-826372d2bb1d-proxy-tls\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:26:26.449795 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.449793 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6149254-214c-471c-8ec1-826372d2bb1d-kserve-provision-location\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:26:26.450089 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.449808 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-raw-sklearn-batcher-199b4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f6149254-214c-471c-8ec1-826372d2bb1d-isvc-raw-sklearn-batcher-199b4-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:26:26.450089 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.449823 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-255q4\" (UniqueName: \"kubernetes.io/projected/f6149254-214c-471c-8ec1-826372d2bb1d-kube-api-access-255q4\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:26:26.806859 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.806756 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" event={"ID":"f6149254-214c-471c-8ec1-826372d2bb1d","Type":"ContainerDied","Data":"7b3eb6a1eb478a6ec1d319e6fc23ce2ba6603c7664b0a0ef7f5967c1ac63d14b"} Apr 28 19:26:26.806859 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.806785 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn" Apr 28 19:26:26.806859 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.806806 2572 scope.go:117] "RemoveContainer" containerID="d46e87290243a5367960d88165a070c43ee67781d9dcb61d32cfcfeae8aa603c" Apr 28 19:26:26.816238 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.816210 2572 scope.go:117] "RemoveContainer" containerID="49b6c4318f1b4dc58a31b2766f76d5dfaf422be7b636389136d2799126143554" Apr 28 19:26:26.823925 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.823904 2572 scope.go:117] "RemoveContainer" containerID="373ece8a05e79fe74181a8c8c18dd0cc78cbe0e2a599b086dbeaba18650ddcaa" Apr 28 19:26:26.832057 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.832028 2572 scope.go:117] "RemoveContainer" containerID="582b217662a21d502da0e75dc8f0af89816fc296516e9c24bee42d23e656ed16" Apr 28 19:26:26.832456 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.832435 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn"] Apr 28 19:26:26.840418 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.840370 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-199b4-predictor-7cdd66cc9f-t74pn"] Apr 28 19:26:26.889068 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:26.889029 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" path="/var/lib/kubelet/pods/f6149254-214c-471c-8ec1-826372d2bb1d/volumes" Apr 28 19:26:28.736954 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:28.736908 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 28 
19:26:28.798905 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:28.798874 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:26:28.799397 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:28.799371 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 28 19:26:38.736666 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:38.736622 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 28 19:26:38.822054 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:38.800225 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 28 19:26:48.737393 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:48.737353 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 28 19:26:48.799395 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:48.799353 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" 
podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 28 19:26:58.736800 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:58.736756 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 28 19:26:58.799851 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:26:58.799809 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 28 19:27:08.737229 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:08.737158 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 28 19:27:08.799719 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:08.799668 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 28 19:27:18.737248 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:18.737215 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:27:18.799914 ip-10-0-134-36 kubenswrapper[2572]: I0428 
19:27:18.799879 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 28 19:27:28.800263 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:28.800232 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:27:46.165933 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.165897 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj"] Apr 28 19:27:46.166438 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.166270 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kserve-container" containerID="cri-o://96763129595c614e3f8783552e78303e47859043977de4fd0ab7940a1195a5ae" gracePeriod=30 Apr 28 19:27:46.166438 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.166295 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kube-rbac-proxy" containerID="cri-o://ef6a5bdd437877d61bd856af0f18f7bec9b27bf3cfaed9b2da99fbd114ce1275" gracePeriod=30 Apr 28 19:27:46.873416 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.873380 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf"] Apr 28 19:27:46.873706 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.873695 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kube-rbac-proxy" Apr 28 19:27:46.873750 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.873708 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kube-rbac-proxy" Apr 28 19:27:46.873786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.873750 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="storage-initializer" Apr 28 19:27:46.873786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.873756 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="storage-initializer" Apr 28 19:27:46.873786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.873762 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="agent" Apr 28 19:27:46.873786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.873768 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="agent" Apr 28 19:27:46.873786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.873782 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kserve-container" Apr 28 19:27:46.873786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.873787 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kserve-container" Apr 28 19:27:46.873961 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.873835 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kube-rbac-proxy" Apr 28 19:27:46.873961 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.873845 2572 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="agent" Apr 28 19:27:46.873961 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.873851 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6149254-214c-471c-8ec1-826372d2bb1d" containerName="kserve-container" Apr 28 19:27:46.876893 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.876875 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:46.879346 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.879321 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\"" Apr 28 19:27:46.879471 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.879390 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-5628a-predictor-serving-cert\"" Apr 28 19:27:46.884496 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.884467 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b4c02de-688d-4548-98fc-22809c409d68-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:46.884619 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.884508 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b4c02de-688d-4548-98fc-22809c409d68-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:46.884619 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.884553 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b4c02de-688d-4548-98fc-22809c409d68-isvc-sklearn-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:46.884619 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.884596 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt8wb\" (UniqueName: \"kubernetes.io/projected/6b4c02de-688d-4548-98fc-22809c409d68-kube-api-access-xt8wb\") pod \"isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:46.888958 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.888935 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf"] Apr 28 19:27:46.985238 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.985200 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b4c02de-688d-4548-98fc-22809c409d68-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:46.985415 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.985239 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b4c02de-688d-4548-98fc-22809c409d68-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:46.985415 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.985288 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b4c02de-688d-4548-98fc-22809c409d68-isvc-sklearn-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:46.985415 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.985330 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt8wb\" (UniqueName: \"kubernetes.io/projected/6b4c02de-688d-4548-98fc-22809c409d68-kube-api-access-xt8wb\") pod \"isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:46.985794 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.985760 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b4c02de-688d-4548-98fc-22809c409d68-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:46.986041 ip-10-0-134-36 kubenswrapper[2572]: I0428 
19:27:46.986020 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b4c02de-688d-4548-98fc-22809c409d68-isvc-sklearn-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:46.987757 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.987734 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b4c02de-688d-4548-98fc-22809c409d68-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:46.995577 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:46.995553 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt8wb\" (UniqueName: \"kubernetes.io/projected/6b4c02de-688d-4548-98fc-22809c409d68-kube-api-access-xt8wb\") pod \"isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:47.036800 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.036770 2572 generic.go:358] "Generic (PLEG): container finished" podID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerID="ef6a5bdd437877d61bd856af0f18f7bec9b27bf3cfaed9b2da99fbd114ce1275" exitCode=2 Apr 28 19:27:47.036932 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.036848 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" 
event={"ID":"c348e295-37a9-4e1d-a806-e255f953dbbe","Type":"ContainerDied","Data":"ef6a5bdd437877d61bd856af0f18f7bec9b27bf3cfaed9b2da99fbd114ce1275"} Apr 28 19:27:47.187851 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.187765 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:47.271197 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.270911 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf"] Apr 28 19:27:47.272139 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.271349 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kserve-container" containerID="cri-o://336b3339134acde5538a5344f35e287083fe34b371386af5704b9ca3ccc129cf" gracePeriod=30 Apr 28 19:27:47.272139 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.271717 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kube-rbac-proxy" containerID="cri-o://23fc031193f5d923b5313afa5049c1a888b6c61bdf60bb958b7980ca43282702" gracePeriod=30 Apr 28 19:27:47.322042 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.322005 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf"] Apr 28 19:27:47.325006 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:27:47.324970 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b4c02de_688d_4548_98fc_22809c409d68.slice/crio-387840fcd6750644f64384ea9147f800c006c3740d6d40566446e6a8adf73469 
WatchSource:0}: Error finding container 387840fcd6750644f64384ea9147f800c006c3740d6d40566446e6a8adf73469: Status 404 returned error can't find the container with id 387840fcd6750644f64384ea9147f800c006c3740d6d40566446e6a8adf73469 Apr 28 19:27:47.378894 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.378875 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:27:47.472698 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.472663 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t"] Apr 28 19:27:47.476673 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.476651 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:47.479586 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.479563 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\"" Apr 28 19:27:47.479700 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.479627 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-5628a-predictor-serving-cert\"" Apr 28 19:27:47.488577 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.488555 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t"] Apr 28 19:27:47.489387 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.489365 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c16c9c9-c169-4cd6-afea-4cd267985347-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:47.489449 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.489410 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7cpp\" (UniqueName: \"kubernetes.io/projected/9c16c9c9-c169-4cd6-afea-4cd267985347-kube-api-access-z7cpp\") pod \"isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:47.489532 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.489514 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c16c9c9-c169-4cd6-afea-4cd267985347-isvc-xgboost-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:47.489582 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.489543 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c16c9c9-c169-4cd6-afea-4cd267985347-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:47.589929 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.589892 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/9c16c9c9-c169-4cd6-afea-4cd267985347-isvc-xgboost-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:47.589929 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.589930 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c16c9c9-c169-4cd6-afea-4cd267985347-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:47.590209 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.589950 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c16c9c9-c169-4cd6-afea-4cd267985347-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:47.590209 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.590071 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7cpp\" (UniqueName: \"kubernetes.io/projected/9c16c9c9-c169-4cd6-afea-4cd267985347-kube-api-access-z7cpp\") pod \"isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:47.590449 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.590418 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/9c16c9c9-c169-4cd6-afea-4cd267985347-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:47.590834 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.590808 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c16c9c9-c169-4cd6-afea-4cd267985347-isvc-xgboost-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:47.592714 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.592693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c16c9c9-c169-4cd6-afea-4cd267985347-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:47.598079 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.598050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7cpp\" (UniqueName: \"kubernetes.io/projected/9c16c9c9-c169-4cd6-afea-4cd267985347-kube-api-access-z7cpp\") pod \"isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:47.787249 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.787131 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:47.910257 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:47.910216 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t"] Apr 28 19:27:47.914377 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:27:47.914346 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c16c9c9_c169_4cd6_afea_4cd267985347.slice/crio-aa237adc49a6913555250d0c40515e8dea9cb2aaa5e48d906f9fc97d1b385c7c WatchSource:0}: Error finding container aa237adc49a6913555250d0c40515e8dea9cb2aaa5e48d906f9fc97d1b385c7c: Status 404 returned error can't find the container with id aa237adc49a6913555250d0c40515e8dea9cb2aaa5e48d906f9fc97d1b385c7c Apr 28 19:27:48.041454 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:48.041363 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" event={"ID":"9c16c9c9-c169-4cd6-afea-4cd267985347","Type":"ContainerStarted","Data":"6615d637a5c09fac201afeaeff9902f5f584fd2bac6cf4110d57966a97fb7b8a"} Apr 28 19:27:48.041454 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:48.041407 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" event={"ID":"9c16c9c9-c169-4cd6-afea-4cd267985347","Type":"ContainerStarted","Data":"aa237adc49a6913555250d0c40515e8dea9cb2aaa5e48d906f9fc97d1b385c7c"} Apr 28 19:27:48.043401 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:48.043370 2572 generic.go:358] "Generic (PLEG): container finished" podID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerID="23fc031193f5d923b5313afa5049c1a888b6c61bdf60bb958b7980ca43282702" exitCode=2 Apr 28 19:27:48.043518 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:48.043438 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" event={"ID":"da2c2459-c227-459c-ae3e-9783f5fa460b","Type":"ContainerDied","Data":"23fc031193f5d923b5313afa5049c1a888b6c61bdf60bb958b7980ca43282702"} Apr 28 19:27:48.044822 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:48.044798 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" event={"ID":"6b4c02de-688d-4548-98fc-22809c409d68","Type":"ContainerStarted","Data":"a71decaefc99947a416f486bbec016fe62f85bced25ff423222a6165c78fd6a3"} Apr 28 19:27:48.044946 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:48.044827 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" event={"ID":"6b4c02de-688d-4548-98fc-22809c409d68","Type":"ContainerStarted","Data":"387840fcd6750644f64384ea9147f800c006c3740d6d40566446e6a8adf73469"} Apr 28 19:27:48.731960 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:48.731917 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.18:8643/healthz\": dial tcp 10.134.0.18:8643: connect: connection refused" Apr 28 19:27:48.736585 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:48.736549 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 28 19:27:48.795530 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:48.795485 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.19:8643/healthz\": dial tcp 10.134.0.19:8643: connect: connection refused" Apr 28 19:27:48.799820 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:48.799794 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.19:8080: connect: connection refused" Apr 28 19:27:50.310031 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:50.310006 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:27:50.415617 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:50.415528 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c348e295-37a9-4e1d-a806-e255f953dbbe-proxy-tls\") pod \"c348e295-37a9-4e1d-a806-e255f953dbbe\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " Apr 28 19:27:50.415617 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:50.415573 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-357ae-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c348e295-37a9-4e1d-a806-e255f953dbbe-isvc-xgboost-graph-raw-357ae-kube-rbac-proxy-sar-config\") pod \"c348e295-37a9-4e1d-a806-e255f953dbbe\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " Apr 28 19:27:50.415617 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:50.415615 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/c348e295-37a9-4e1d-a806-e255f953dbbe-kserve-provision-location\") pod \"c348e295-37a9-4e1d-a806-e255f953dbbe\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " Apr 28 19:27:50.415841 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:50.415668 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlt2x\" (UniqueName: \"kubernetes.io/projected/c348e295-37a9-4e1d-a806-e255f953dbbe-kube-api-access-mlt2x\") pod \"c348e295-37a9-4e1d-a806-e255f953dbbe\" (UID: \"c348e295-37a9-4e1d-a806-e255f953dbbe\") " Apr 28 19:27:50.415980 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:50.415956 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c348e295-37a9-4e1d-a806-e255f953dbbe-isvc-xgboost-graph-raw-357ae-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-357ae-kube-rbac-proxy-sar-config") pod "c348e295-37a9-4e1d-a806-e255f953dbbe" (UID: "c348e295-37a9-4e1d-a806-e255f953dbbe"). InnerVolumeSpecName "isvc-xgboost-graph-raw-357ae-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:27:50.416024 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:50.415955 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c348e295-37a9-4e1d-a806-e255f953dbbe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c348e295-37a9-4e1d-a806-e255f953dbbe" (UID: "c348e295-37a9-4e1d-a806-e255f953dbbe"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:27:50.417798 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:50.417776 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c348e295-37a9-4e1d-a806-e255f953dbbe-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c348e295-37a9-4e1d-a806-e255f953dbbe" (UID: "c348e295-37a9-4e1d-a806-e255f953dbbe"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:27:50.417874 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:50.417857 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c348e295-37a9-4e1d-a806-e255f953dbbe-kube-api-access-mlt2x" (OuterVolumeSpecName: "kube-api-access-mlt2x") pod "c348e295-37a9-4e1d-a806-e255f953dbbe" (UID: "c348e295-37a9-4e1d-a806-e255f953dbbe"). InnerVolumeSpecName "kube-api-access-mlt2x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:27:50.516432 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:50.516381 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c348e295-37a9-4e1d-a806-e255f953dbbe-proxy-tls\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:27:50.516432 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:50.516426 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-357ae-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c348e295-37a9-4e1d-a806-e255f953dbbe-isvc-xgboost-graph-raw-357ae-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:27:50.516432 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:50.516437 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c348e295-37a9-4e1d-a806-e255f953dbbe-kserve-provision-location\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" 
Apr 28 19:27:50.516659 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:50.516448 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mlt2x\" (UniqueName: \"kubernetes.io/projected/c348e295-37a9-4e1d-a806-e255f953dbbe-kube-api-access-mlt2x\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:27:51.055565 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.055531 2572 generic.go:358] "Generic (PLEG): container finished" podID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerID="96763129595c614e3f8783552e78303e47859043977de4fd0ab7940a1195a5ae" exitCode=0 Apr 28 19:27:51.055704 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.055609 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" event={"ID":"c348e295-37a9-4e1d-a806-e255f953dbbe","Type":"ContainerDied","Data":"96763129595c614e3f8783552e78303e47859043977de4fd0ab7940a1195a5ae"} Apr 28 19:27:51.055704 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.055652 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" event={"ID":"c348e295-37a9-4e1d-a806-e255f953dbbe","Type":"ContainerDied","Data":"8bb13f8b9f6e6d951e5884b9c023102612fec0045bb27e6391ec81e1d337bef2"} Apr 28 19:27:51.055704 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.055664 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj" Apr 28 19:27:51.055704 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.055673 2572 scope.go:117] "RemoveContainer" containerID="ef6a5bdd437877d61bd856af0f18f7bec9b27bf3cfaed9b2da99fbd114ce1275" Apr 28 19:27:51.056932 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.056910 2572 generic.go:358] "Generic (PLEG): container finished" podID="6b4c02de-688d-4548-98fc-22809c409d68" containerID="a71decaefc99947a416f486bbec016fe62f85bced25ff423222a6165c78fd6a3" exitCode=0 Apr 28 19:27:51.057048 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.056979 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" event={"ID":"6b4c02de-688d-4548-98fc-22809c409d68","Type":"ContainerDied","Data":"a71decaefc99947a416f486bbec016fe62f85bced25ff423222a6165c78fd6a3"} Apr 28 19:27:51.090234 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.090214 2572 scope.go:117] "RemoveContainer" containerID="96763129595c614e3f8783552e78303e47859043977de4fd0ab7940a1195a5ae" Apr 28 19:27:51.111973 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.111952 2572 scope.go:117] "RemoveContainer" containerID="6475de39a224cea235055988be721e9119717c60b872719dd1dc368311287638" Apr 28 19:27:51.123782 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.123740 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj"] Apr 28 19:27:51.127347 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.127320 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-357ae-predictor-f454f465-2dszj"] Apr 28 19:27:51.130127 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.130109 2572 scope.go:117] "RemoveContainer" containerID="ef6a5bdd437877d61bd856af0f18f7bec9b27bf3cfaed9b2da99fbd114ce1275" Apr 28 19:27:51.130488 
ip-10-0-134-36 kubenswrapper[2572]: E0428 19:27:51.130464 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef6a5bdd437877d61bd856af0f18f7bec9b27bf3cfaed9b2da99fbd114ce1275\": container with ID starting with ef6a5bdd437877d61bd856af0f18f7bec9b27bf3cfaed9b2da99fbd114ce1275 not found: ID does not exist" containerID="ef6a5bdd437877d61bd856af0f18f7bec9b27bf3cfaed9b2da99fbd114ce1275" Apr 28 19:27:51.130571 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.130496 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef6a5bdd437877d61bd856af0f18f7bec9b27bf3cfaed9b2da99fbd114ce1275"} err="failed to get container status \"ef6a5bdd437877d61bd856af0f18f7bec9b27bf3cfaed9b2da99fbd114ce1275\": rpc error: code = NotFound desc = could not find container \"ef6a5bdd437877d61bd856af0f18f7bec9b27bf3cfaed9b2da99fbd114ce1275\": container with ID starting with ef6a5bdd437877d61bd856af0f18f7bec9b27bf3cfaed9b2da99fbd114ce1275 not found: ID does not exist" Apr 28 19:27:51.130571 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.130526 2572 scope.go:117] "RemoveContainer" containerID="96763129595c614e3f8783552e78303e47859043977de4fd0ab7940a1195a5ae" Apr 28 19:27:51.130779 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:27:51.130757 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96763129595c614e3f8783552e78303e47859043977de4fd0ab7940a1195a5ae\": container with ID starting with 96763129595c614e3f8783552e78303e47859043977de4fd0ab7940a1195a5ae not found: ID does not exist" containerID="96763129595c614e3f8783552e78303e47859043977de4fd0ab7940a1195a5ae" Apr 28 19:27:51.130838 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.130789 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96763129595c614e3f8783552e78303e47859043977de4fd0ab7940a1195a5ae"} 
err="failed to get container status \"96763129595c614e3f8783552e78303e47859043977de4fd0ab7940a1195a5ae\": rpc error: code = NotFound desc = could not find container \"96763129595c614e3f8783552e78303e47859043977de4fd0ab7940a1195a5ae\": container with ID starting with 96763129595c614e3f8783552e78303e47859043977de4fd0ab7940a1195a5ae not found: ID does not exist" Apr 28 19:27:51.130838 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.130814 2572 scope.go:117] "RemoveContainer" containerID="6475de39a224cea235055988be721e9119717c60b872719dd1dc368311287638" Apr 28 19:27:51.131122 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:27:51.131103 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6475de39a224cea235055988be721e9119717c60b872719dd1dc368311287638\": container with ID starting with 6475de39a224cea235055988be721e9119717c60b872719dd1dc368311287638 not found: ID does not exist" containerID="6475de39a224cea235055988be721e9119717c60b872719dd1dc368311287638" Apr 28 19:27:51.131162 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:51.131127 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6475de39a224cea235055988be721e9119717c60b872719dd1dc368311287638"} err="failed to get container status \"6475de39a224cea235055988be721e9119717c60b872719dd1dc368311287638\": rpc error: code = NotFound desc = could not find container \"6475de39a224cea235055988be721e9119717c60b872719dd1dc368311287638\": container with ID starting with 6475de39a224cea235055988be721e9119717c60b872719dd1dc368311287638 not found: ID does not exist" Apr 28 19:27:52.063363 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.063323 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" 
event={"ID":"6b4c02de-688d-4548-98fc-22809c409d68","Type":"ContainerStarted","Data":"e6c7dd2699b870d8cfa0ada9ec7f0e8c76a66414567020a97500531331fd0298"} Apr 28 19:27:52.063780 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.063402 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" event={"ID":"6b4c02de-688d-4548-98fc-22809c409d68","Type":"ContainerStarted","Data":"6466237f66839e86d380179e7ad503a4b5751dc745bd9137411cdd6a07af08f5"} Apr 28 19:27:52.063780 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.063671 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:52.063780 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.063702 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:52.065083 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.065049 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 28 19:27:52.065239 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.065094 2572 generic.go:358] "Generic (PLEG): container finished" podID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerID="6615d637a5c09fac201afeaeff9902f5f584fd2bac6cf4110d57966a97fb7b8a" exitCode=0 Apr 28 19:27:52.065239 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.065163 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" 
event={"ID":"9c16c9c9-c169-4cd6-afea-4cd267985347","Type":"ContainerDied","Data":"6615d637a5c09fac201afeaeff9902f5f584fd2bac6cf4110d57966a97fb7b8a"} Apr 28 19:27:52.067445 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.067421 2572 generic.go:358] "Generic (PLEG): container finished" podID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerID="336b3339134acde5538a5344f35e287083fe34b371386af5704b9ca3ccc129cf" exitCode=0 Apr 28 19:27:52.067554 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.067439 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" event={"ID":"da2c2459-c227-459c-ae3e-9783f5fa460b","Type":"ContainerDied","Data":"336b3339134acde5538a5344f35e287083fe34b371386af5704b9ca3ccc129cf"} Apr 28 19:27:52.081760 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.081714 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" podStartSLOduration=6.081697424 podStartE2EDuration="6.081697424s" podCreationTimestamp="2026-04-28 19:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:27:52.08106656 +0000 UTC m=+713.753511664" watchObservedRunningTime="2026-04-28 19:27:52.081697424 +0000 UTC m=+713.754142529" Apr 28 19:27:52.134056 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.134033 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:27:52.231426 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.231368 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da2c2459-c227-459c-ae3e-9783f5fa460b-proxy-tls\") pod \"da2c2459-c227-459c-ae3e-9783f5fa460b\" (UID: \"da2c2459-c227-459c-ae3e-9783f5fa460b\") " Apr 28 19:27:52.231426 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.231422 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-357ae-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da2c2459-c227-459c-ae3e-9783f5fa460b-isvc-sklearn-graph-raw-357ae-kube-rbac-proxy-sar-config\") pod \"da2c2459-c227-459c-ae3e-9783f5fa460b\" (UID: \"da2c2459-c227-459c-ae3e-9783f5fa460b\") " Apr 28 19:27:52.231555 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.231472 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da2c2459-c227-459c-ae3e-9783f5fa460b-kserve-provision-location\") pod \"da2c2459-c227-459c-ae3e-9783f5fa460b\" (UID: \"da2c2459-c227-459c-ae3e-9783f5fa460b\") " Apr 28 19:27:52.231555 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.231491 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtffl\" (UniqueName: \"kubernetes.io/projected/da2c2459-c227-459c-ae3e-9783f5fa460b-kube-api-access-vtffl\") pod \"da2c2459-c227-459c-ae3e-9783f5fa460b\" (UID: \"da2c2459-c227-459c-ae3e-9783f5fa460b\") " Apr 28 19:27:52.231777 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.231755 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da2c2459-c227-459c-ae3e-9783f5fa460b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") 
pod "da2c2459-c227-459c-ae3e-9783f5fa460b" (UID: "da2c2459-c227-459c-ae3e-9783f5fa460b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:27:52.231850 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.231769 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da2c2459-c227-459c-ae3e-9783f5fa460b-isvc-sklearn-graph-raw-357ae-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-357ae-kube-rbac-proxy-sar-config") pod "da2c2459-c227-459c-ae3e-9783f5fa460b" (UID: "da2c2459-c227-459c-ae3e-9783f5fa460b"). InnerVolumeSpecName "isvc-sklearn-graph-raw-357ae-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:27:52.233557 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.233524 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2c2459-c227-459c-ae3e-9783f5fa460b-kube-api-access-vtffl" (OuterVolumeSpecName: "kube-api-access-vtffl") pod "da2c2459-c227-459c-ae3e-9783f5fa460b" (UID: "da2c2459-c227-459c-ae3e-9783f5fa460b"). InnerVolumeSpecName "kube-api-access-vtffl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:27:52.233557 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.233550 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2c2459-c227-459c-ae3e-9783f5fa460b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "da2c2459-c227-459c-ae3e-9783f5fa460b" (UID: "da2c2459-c227-459c-ae3e-9783f5fa460b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:27:52.332154 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.332111 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da2c2459-c227-459c-ae3e-9783f5fa460b-proxy-tls\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:27:52.332154 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.332147 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-357ae-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da2c2459-c227-459c-ae3e-9783f5fa460b-isvc-sklearn-graph-raw-357ae-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:27:52.332154 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.332158 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da2c2459-c227-459c-ae3e-9783f5fa460b-kserve-provision-location\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:27:52.332420 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.332168 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vtffl\" (UniqueName: \"kubernetes.io/projected/da2c2459-c227-459c-ae3e-9783f5fa460b-kube-api-access-vtffl\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:27:52.890497 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:52.890462 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" path="/var/lib/kubelet/pods/c348e295-37a9-4e1d-a806-e255f953dbbe/volumes" Apr 28 19:27:53.073851 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:53.073812 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" 
event={"ID":"9c16c9c9-c169-4cd6-afea-4cd267985347","Type":"ContainerStarted","Data":"04d514cdd278349fb2521401e159475006d9e122bb22b6683b589835e4992d8c"} Apr 28 19:27:53.074340 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:53.073863 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" event={"ID":"9c16c9c9-c169-4cd6-afea-4cd267985347","Type":"ContainerStarted","Data":"784e06e005480dc785e5c63916e626303aaa2152c817d01f244257180db289c2"} Apr 28 19:27:53.074340 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:53.074204 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:53.074447 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:53.074350 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:53.075595 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:53.075564 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 28 19:27:53.075781 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:53.075763 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" Apr 28 19:27:53.075781 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:53.075765 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf" event={"ID":"da2c2459-c227-459c-ae3e-9783f5fa460b","Type":"ContainerDied","Data":"e454eadc4dd3a20ae1769a6dedb267d4d2fa3702ec3792929d82fead87f43ebe"} Apr 28 19:27:53.075906 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:53.075803 2572 scope.go:117] "RemoveContainer" containerID="23fc031193f5d923b5313afa5049c1a888b6c61bdf60bb958b7980ca43282702" Apr 28 19:27:53.076038 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:53.076000 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 28 19:27:53.084800 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:53.084779 2572 scope.go:117] "RemoveContainer" containerID="336b3339134acde5538a5344f35e287083fe34b371386af5704b9ca3ccc129cf" Apr 28 19:27:53.092223 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:53.092041 2572 scope.go:117] "RemoveContainer" containerID="e519b5faa07ed45c094b9d609b81761e13acfb900fa7e749e31397f67245bffd" Apr 28 19:27:53.092686 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:53.092647 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" podStartSLOduration=6.092635557 podStartE2EDuration="6.092635557s" podCreationTimestamp="2026-04-28 19:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:27:53.091326165 +0000 UTC m=+714.763771279" 
watchObservedRunningTime="2026-04-28 19:27:53.092635557 +0000 UTC m=+714.765080661" Apr 28 19:27:53.104068 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:53.104045 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf"] Apr 28 19:27:53.107631 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:53.107612 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-357ae-predictor-56864bdff5-9mmsf"] Apr 28 19:27:54.079704 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:54.079662 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 28 19:27:54.889324 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:54.889287 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" path="/var/lib/kubelet/pods/da2c2459-c227-459c-ae3e-9783f5fa460b/volumes" Apr 28 19:27:58.080855 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:58.080824 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:27:58.081410 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:58.081383 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 28 19:27:59.084562 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:59.084485 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:27:59.085017 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:27:59.084979 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 28 19:28:08.081753 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:28:08.081711 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 28 19:28:09.085519 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:28:09.085472 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 28 19:28:18.082273 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:28:18.082226 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 28 19:28:19.085438 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:28:19.085394 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: 
connect: connection refused" Apr 28 19:28:28.081330 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:28:28.081282 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 28 19:28:29.085099 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:28:29.085048 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 28 19:28:38.081434 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:28:38.081388 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 28 19:28:39.085412 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:28:39.085370 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 28 19:28:48.081428 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:28:48.081384 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 28 19:28:49.085871 
ip-10-0-134-36 kubenswrapper[2572]: I0428 19:28:49.085831 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 28 19:28:58.081908 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:28:58.081869 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused" Apr 28 19:28:59.085335 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:28:59.085305 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:29:01.886372 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:01.886330 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:29:26.515125 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.515031 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t"] Apr 28 19:29:26.515619 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.515375 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kserve-container" containerID="cri-o://784e06e005480dc785e5c63916e626303aaa2152c817d01f244257180db289c2" gracePeriod=30 Apr 28 19:29:26.515619 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.515404 2572 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kube-rbac-proxy" containerID="cri-o://04d514cdd278349fb2521401e159475006d9e122bb22b6683b589835e4992d8c" gracePeriod=30 Apr 28 19:29:26.960003 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.959966 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf"] Apr 28 19:29:26.960315 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.960287 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kserve-container" containerID="cri-o://6466237f66839e86d380179e7ad503a4b5751dc745bd9137411cdd6a07af08f5" gracePeriod=30 Apr 28 19:29:26.960891 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.960356 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kube-rbac-proxy" containerID="cri-o://e6c7dd2699b870d8cfa0ada9ec7f0e8c76a66414567020a97500531331fd0298" gracePeriod=30 Apr 28 19:29:26.993167 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993131 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq"] Apr 28 19:29:26.993502 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993485 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="storage-initializer" Apr 28 19:29:26.993584 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993505 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" 
containerName="storage-initializer" Apr 28 19:29:26.993584 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993517 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kube-rbac-proxy" Apr 28 19:29:26.993584 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993525 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kube-rbac-proxy" Apr 28 19:29:26.993584 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993557 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="storage-initializer" Apr 28 19:29:26.993584 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993566 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="storage-initializer" Apr 28 19:29:26.993584 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993575 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kube-rbac-proxy" Apr 28 19:29:26.993584 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993583 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kube-rbac-proxy" Apr 28 19:29:26.993954 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993593 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kserve-container" Apr 28 19:29:26.993954 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993602 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kserve-container" Apr 28 19:29:26.993954 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993618 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kserve-container" Apr 28 19:29:26.993954 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993626 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kserve-container" Apr 28 19:29:26.993954 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993692 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kserve-container" Apr 28 19:29:26.993954 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993704 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="da2c2459-c227-459c-ae3e-9783f5fa460b" containerName="kube-rbac-proxy" Apr 28 19:29:26.993954 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993715 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kube-rbac-proxy" Apr 28 19:29:26.993954 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.993723 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c348e295-37a9-4e1d-a806-e255f953dbbe" containerName="kserve-container" Apr 28 19:29:26.996812 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.996793 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:26.999494 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.999472 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-a8bff-predictor-serving-cert\"" Apr 28 19:29:26.999681 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:26.999664 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-a8bff-kube-rbac-proxy-sar-config\"" Apr 28 19:29:27.006436 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:27.006414 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq"] Apr 28 19:29:27.071923 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:27.071878 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-raw-a8bff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c323af27-9a6e-4bba-86e5-6e631c80460b-message-dumper-raw-a8bff-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq\" (UID: \"c323af27-9a6e-4bba-86e5-6e631c80460b\") " pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:27.072073 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:27.071934 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c323af27-9a6e-4bba-86e5-6e631c80460b-proxy-tls\") pod \"message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq\" (UID: \"c323af27-9a6e-4bba-86e5-6e631c80460b\") " pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:27.072073 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:27.072029 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-j6krr\" (UniqueName: \"kubernetes.io/projected/c323af27-9a6e-4bba-86e5-6e631c80460b-kube-api-access-j6krr\") pod \"message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq\" (UID: \"c323af27-9a6e-4bba-86e5-6e631c80460b\") " pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:27.172627 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:27.172593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6krr\" (UniqueName: \"kubernetes.io/projected/c323af27-9a6e-4bba-86e5-6e631c80460b-kube-api-access-j6krr\") pod \"message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq\" (UID: \"c323af27-9a6e-4bba-86e5-6e631c80460b\") " pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:27.172817 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:27.172650 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-raw-a8bff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c323af27-9a6e-4bba-86e5-6e631c80460b-message-dumper-raw-a8bff-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq\" (UID: \"c323af27-9a6e-4bba-86e5-6e631c80460b\") " pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:27.172817 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:27.172681 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c323af27-9a6e-4bba-86e5-6e631c80460b-proxy-tls\") pod \"message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq\" (UID: \"c323af27-9a6e-4bba-86e5-6e631c80460b\") " pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:27.172817 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:29:27.172783 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-serving-cert: secret 
"message-dumper-raw-a8bff-predictor-serving-cert" not found Apr 28 19:29:27.173001 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:29:27.172844 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c323af27-9a6e-4bba-86e5-6e631c80460b-proxy-tls podName:c323af27-9a6e-4bba-86e5-6e631c80460b nodeName:}" failed. No retries permitted until 2026-04-28 19:29:27.672823644 +0000 UTC m=+809.345268732 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c323af27-9a6e-4bba-86e5-6e631c80460b-proxy-tls") pod "message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" (UID: "c323af27-9a6e-4bba-86e5-6e631c80460b") : secret "message-dumper-raw-a8bff-predictor-serving-cert" not found Apr 28 19:29:27.173380 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:27.173355 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-raw-a8bff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c323af27-9a6e-4bba-86e5-6e631c80460b-message-dumper-raw-a8bff-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq\" (UID: \"c323af27-9a6e-4bba-86e5-6e631c80460b\") " pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:27.182091 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:27.182062 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6krr\" (UniqueName: \"kubernetes.io/projected/c323af27-9a6e-4bba-86e5-6e631c80460b-kube-api-access-j6krr\") pod \"message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq\" (UID: \"c323af27-9a6e-4bba-86e5-6e631c80460b\") " pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:27.352679 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:27.352593 2572 generic.go:358] "Generic (PLEG): container finished" podID="6b4c02de-688d-4548-98fc-22809c409d68" 
containerID="e6c7dd2699b870d8cfa0ada9ec7f0e8c76a66414567020a97500531331fd0298" exitCode=2 Apr 28 19:29:27.352679 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:27.352666 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" event={"ID":"6b4c02de-688d-4548-98fc-22809c409d68","Type":"ContainerDied","Data":"e6c7dd2699b870d8cfa0ada9ec7f0e8c76a66414567020a97500531331fd0298"} Apr 28 19:29:27.354392 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:27.354368 2572 generic.go:358] "Generic (PLEG): container finished" podID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerID="04d514cdd278349fb2521401e159475006d9e122bb22b6683b589835e4992d8c" exitCode=2 Apr 28 19:29:27.354514 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:27.354405 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" event={"ID":"9c16c9c9-c169-4cd6-afea-4cd267985347","Type":"ContainerDied","Data":"04d514cdd278349fb2521401e159475006d9e122bb22b6683b589835e4992d8c"} Apr 28 19:29:27.676343 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:27.676250 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c323af27-9a6e-4bba-86e5-6e631c80460b-proxy-tls\") pod \"message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq\" (UID: \"c323af27-9a6e-4bba-86e5-6e631c80460b\") " pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:27.676694 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:29:27.676402 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-serving-cert: secret "message-dumper-raw-a8bff-predictor-serving-cert" not found Apr 28 19:29:27.676694 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:29:27.676469 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c323af27-9a6e-4bba-86e5-6e631c80460b-proxy-tls podName:c323af27-9a6e-4bba-86e5-6e631c80460b nodeName:}" failed. No retries permitted until 2026-04-28 19:29:28.676453519 +0000 UTC m=+810.348898604 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c323af27-9a6e-4bba-86e5-6e631c80460b-proxy-tls") pod "message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" (UID: "c323af27-9a6e-4bba-86e5-6e631c80460b") : secret "message-dumper-raw-a8bff-predictor-serving-cert" not found Apr 28 19:29:28.076594 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:28.076554 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.20:8643/healthz\": dial tcp 10.134.0.20:8643: connect: connection refused" Apr 28 19:29:28.684535 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:28.684497 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c323af27-9a6e-4bba-86e5-6e631c80460b-proxy-tls\") pod \"message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq\" (UID: \"c323af27-9a6e-4bba-86e5-6e631c80460b\") " pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:28.687024 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:28.686998 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c323af27-9a6e-4bba-86e5-6e631c80460b-proxy-tls\") pod \"message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq\" (UID: \"c323af27-9a6e-4bba-86e5-6e631c80460b\") " pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:28.807620 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:28.807583 2572 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:28.929591 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:28.929444 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq"] Apr 28 19:29:28.932207 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:29:28.932148 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc323af27_9a6e_4bba_86e5_6e631c80460b.slice/crio-46313cf29e5ab7f67d8817ec134074c8adc7f8f285e9b0d09aee2d763402a934 WatchSource:0}: Error finding container 46313cf29e5ab7f67d8817ec134074c8adc7f8f285e9b0d09aee2d763402a934: Status 404 returned error can't find the container with id 46313cf29e5ab7f67d8817ec134074c8adc7f8f285e9b0d09aee2d763402a934 Apr 28 19:29:29.080228 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:29.080152 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.21:8643/healthz\": dial tcp 10.134.0.21:8643: connect: connection refused" Apr 28 19:29:29.085583 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:29.085555 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused" Apr 28 19:29:29.362096 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:29.362055 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" 
event={"ID":"c323af27-9a6e-4bba-86e5-6e631c80460b","Type":"ContainerStarted","Data":"46313cf29e5ab7f67d8817ec134074c8adc7f8f285e9b0d09aee2d763402a934"} Apr 28 19:29:30.265117 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.265091 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:29:30.297281 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.297249 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c16c9c9-c169-4cd6-afea-4cd267985347-kserve-provision-location\") pod \"9c16c9c9-c169-4cd6-afea-4cd267985347\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") " Apr 28 19:29:30.297588 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.297335 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7cpp\" (UniqueName: \"kubernetes.io/projected/9c16c9c9-c169-4cd6-afea-4cd267985347-kube-api-access-z7cpp\") pod \"9c16c9c9-c169-4cd6-afea-4cd267985347\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") " Apr 28 19:29:30.297588 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.297372 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c16c9c9-c169-4cd6-afea-4cd267985347-isvc-xgboost-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\") pod \"9c16c9c9-c169-4cd6-afea-4cd267985347\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") " Apr 28 19:29:30.297588 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.297406 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c16c9c9-c169-4cd6-afea-4cd267985347-proxy-tls\") pod \"9c16c9c9-c169-4cd6-afea-4cd267985347\" (UID: \"9c16c9c9-c169-4cd6-afea-4cd267985347\") 
" Apr 28 19:29:30.297772 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.297637 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c16c9c9-c169-4cd6-afea-4cd267985347-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9c16c9c9-c169-4cd6-afea-4cd267985347" (UID: "9c16c9c9-c169-4cd6-afea-4cd267985347"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:29:30.297830 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.297783 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c16c9c9-c169-4cd6-afea-4cd267985347-isvc-xgboost-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config") pod "9c16c9c9-c169-4cd6-afea-4cd267985347" (UID: "9c16c9c9-c169-4cd6-afea-4cd267985347"). InnerVolumeSpecName "isvc-xgboost-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:29:30.299621 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.299592 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c16c9c9-c169-4cd6-afea-4cd267985347-kube-api-access-z7cpp" (OuterVolumeSpecName: "kube-api-access-z7cpp") pod "9c16c9c9-c169-4cd6-afea-4cd267985347" (UID: "9c16c9c9-c169-4cd6-afea-4cd267985347"). InnerVolumeSpecName "kube-api-access-z7cpp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:29:30.299754 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.299661 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c16c9c9-c169-4cd6-afea-4cd267985347-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9c16c9c9-c169-4cd6-afea-4cd267985347" (UID: "9c16c9c9-c169-4cd6-afea-4cd267985347"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:29:30.367446 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.367415 2572 generic.go:358] "Generic (PLEG): container finished" podID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerID="784e06e005480dc785e5c63916e626303aaa2152c817d01f244257180db289c2" exitCode=0 Apr 28 19:29:30.367596 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.367502 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" Apr 28 19:29:30.367596 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.367509 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" event={"ID":"9c16c9c9-c169-4cd6-afea-4cd267985347","Type":"ContainerDied","Data":"784e06e005480dc785e5c63916e626303aaa2152c817d01f244257180db289c2"} Apr 28 19:29:30.367596 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.367558 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t" event={"ID":"9c16c9c9-c169-4cd6-afea-4cd267985347","Type":"ContainerDied","Data":"aa237adc49a6913555250d0c40515e8dea9cb2aaa5e48d906f9fc97d1b385c7c"} Apr 28 19:29:30.367596 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.367578 2572 scope.go:117] "RemoveContainer" containerID="04d514cdd278349fb2521401e159475006d9e122bb22b6683b589835e4992d8c" Apr 28 19:29:30.369574 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.369545 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" event={"ID":"c323af27-9a6e-4bba-86e5-6e631c80460b","Type":"ContainerStarted","Data":"af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855"} Apr 28 19:29:30.369703 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.369583 2572 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" event={"ID":"c323af27-9a6e-4bba-86e5-6e631c80460b","Type":"ContainerStarted","Data":"acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753"} Apr 28 19:29:30.369759 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.369714 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:30.376639 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.376365 2572 scope.go:117] "RemoveContainer" containerID="784e06e005480dc785e5c63916e626303aaa2152c817d01f244257180db289c2" Apr 28 19:29:30.383417 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.383396 2572 scope.go:117] "RemoveContainer" containerID="6615d637a5c09fac201afeaeff9902f5f584fd2bac6cf4110d57966a97fb7b8a" Apr 28 19:29:30.393367 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.393321 2572 scope.go:117] "RemoveContainer" containerID="04d514cdd278349fb2521401e159475006d9e122bb22b6683b589835e4992d8c" Apr 28 19:29:30.393908 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:29:30.393704 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04d514cdd278349fb2521401e159475006d9e122bb22b6683b589835e4992d8c\": container with ID starting with 04d514cdd278349fb2521401e159475006d9e122bb22b6683b589835e4992d8c not found: ID does not exist" containerID="04d514cdd278349fb2521401e159475006d9e122bb22b6683b589835e4992d8c" Apr 28 19:29:30.393998 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.393924 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04d514cdd278349fb2521401e159475006d9e122bb22b6683b589835e4992d8c"} err="failed to get container status \"04d514cdd278349fb2521401e159475006d9e122bb22b6683b589835e4992d8c\": rpc error: code = NotFound desc = could not find container 
\"04d514cdd278349fb2521401e159475006d9e122bb22b6683b589835e4992d8c\": container with ID starting with 04d514cdd278349fb2521401e159475006d9e122bb22b6683b589835e4992d8c not found: ID does not exist" Apr 28 19:29:30.393998 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.393952 2572 scope.go:117] "RemoveContainer" containerID="784e06e005480dc785e5c63916e626303aaa2152c817d01f244257180db289c2" Apr 28 19:29:30.394323 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:29:30.394301 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"784e06e005480dc785e5c63916e626303aaa2152c817d01f244257180db289c2\": container with ID starting with 784e06e005480dc785e5c63916e626303aaa2152c817d01f244257180db289c2 not found: ID does not exist" containerID="784e06e005480dc785e5c63916e626303aaa2152c817d01f244257180db289c2" Apr 28 19:29:30.394425 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.394329 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"784e06e005480dc785e5c63916e626303aaa2152c817d01f244257180db289c2"} err="failed to get container status \"784e06e005480dc785e5c63916e626303aaa2152c817d01f244257180db289c2\": rpc error: code = NotFound desc = could not find container \"784e06e005480dc785e5c63916e626303aaa2152c817d01f244257180db289c2\": container with ID starting with 784e06e005480dc785e5c63916e626303aaa2152c817d01f244257180db289c2 not found: ID does not exist" Apr 28 19:29:30.394425 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.394372 2572 scope.go:117] "RemoveContainer" containerID="6615d637a5c09fac201afeaeff9902f5f584fd2bac6cf4110d57966a97fb7b8a" Apr 28 19:29:30.394690 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:29:30.394660 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6615d637a5c09fac201afeaeff9902f5f584fd2bac6cf4110d57966a97fb7b8a\": container with ID starting with 
6615d637a5c09fac201afeaeff9902f5f584fd2bac6cf4110d57966a97fb7b8a not found: ID does not exist" containerID="6615d637a5c09fac201afeaeff9902f5f584fd2bac6cf4110d57966a97fb7b8a" Apr 28 19:29:30.394788 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.394696 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6615d637a5c09fac201afeaeff9902f5f584fd2bac6cf4110d57966a97fb7b8a"} err="failed to get container status \"6615d637a5c09fac201afeaeff9902f5f584fd2bac6cf4110d57966a97fb7b8a\": rpc error: code = NotFound desc = could not find container \"6615d637a5c09fac201afeaeff9902f5f584fd2bac6cf4110d57966a97fb7b8a\": container with ID starting with 6615d637a5c09fac201afeaeff9902f5f584fd2bac6cf4110d57966a97fb7b8a not found: ID does not exist" Apr 28 19:29:30.395481 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.395444 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" podStartSLOduration=3.336955214 podStartE2EDuration="4.395433309s" podCreationTimestamp="2026-04-28 19:29:26 +0000 UTC" firstStartedPulling="2026-04-28 19:29:28.93391005 +0000 UTC m=+810.606355131" lastFinishedPulling="2026-04-28 19:29:29.992388139 +0000 UTC m=+811.664833226" observedRunningTime="2026-04-28 19:29:30.393903443 +0000 UTC m=+812.066348548" watchObservedRunningTime="2026-04-28 19:29:30.395433309 +0000 UTC m=+812.067878412" Apr 28 19:29:30.398379 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.398361 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c16c9c9-c169-4cd6-afea-4cd267985347-kserve-provision-location\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:29:30.398466 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.398381 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z7cpp\" (UniqueName: 
\"kubernetes.io/projected/9c16c9c9-c169-4cd6-afea-4cd267985347-kube-api-access-z7cpp\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:29:30.398466 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.398392 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c16c9c9-c169-4cd6-afea-4cd267985347-isvc-xgboost-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:29:30.398466 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.398401 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c16c9c9-c169-4cd6-afea-4cd267985347-proxy-tls\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:29:30.406382 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.406353 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t"] Apr 28 19:29:30.409696 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.409674 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-5628a-predictor-6fcd96cc69-66x2t"] Apr 28 19:29:30.888419 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:30.888386 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" path="/var/lib/kubelet/pods/9c16c9c9-c169-4cd6-afea-4cd267985347/volumes" Apr 28 19:29:31.373299 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:31.373274 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:31.374964 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:31.374938 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:29:31.602053 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:31.602031 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:29:31.708629 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:31.708601 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b4c02de-688d-4548-98fc-22809c409d68-kserve-provision-location\") pod \"6b4c02de-688d-4548-98fc-22809c409d68\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " Apr 28 19:29:31.708764 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:31.708647 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt8wb\" (UniqueName: \"kubernetes.io/projected/6b4c02de-688d-4548-98fc-22809c409d68-kube-api-access-xt8wb\") pod \"6b4c02de-688d-4548-98fc-22809c409d68\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " Apr 28 19:29:31.708764 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:31.708670 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b4c02de-688d-4548-98fc-22809c409d68-proxy-tls\") pod \"6b4c02de-688d-4548-98fc-22809c409d68\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " Apr 28 19:29:31.708764 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:31.708706 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b4c02de-688d-4548-98fc-22809c409d68-isvc-sklearn-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\") pod \"6b4c02de-688d-4548-98fc-22809c409d68\" (UID: \"6b4c02de-688d-4548-98fc-22809c409d68\") " Apr 28 19:29:31.709003 ip-10-0-134-36 kubenswrapper[2572]: I0428 
19:29:31.708977 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b4c02de-688d-4548-98fc-22809c409d68-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6b4c02de-688d-4548-98fc-22809c409d68" (UID: "6b4c02de-688d-4548-98fc-22809c409d68"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:29:31.709167 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:31.709138 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4c02de-688d-4548-98fc-22809c409d68-isvc-sklearn-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config") pod "6b4c02de-688d-4548-98fc-22809c409d68" (UID: "6b4c02de-688d-4548-98fc-22809c409d68"). InnerVolumeSpecName "isvc-sklearn-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:29:31.710693 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:31.710666 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b4c02de-688d-4548-98fc-22809c409d68-kube-api-access-xt8wb" (OuterVolumeSpecName: "kube-api-access-xt8wb") pod "6b4c02de-688d-4548-98fc-22809c409d68" (UID: "6b4c02de-688d-4548-98fc-22809c409d68"). InnerVolumeSpecName "kube-api-access-xt8wb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:29:31.710693 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:31.710686 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b4c02de-688d-4548-98fc-22809c409d68-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6b4c02de-688d-4548-98fc-22809c409d68" (UID: "6b4c02de-688d-4548-98fc-22809c409d68"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:29:31.809784 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:31.809730 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b4c02de-688d-4548-98fc-22809c409d68-proxy-tls\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:29:31.809784 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:31.809776 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b4c02de-688d-4548-98fc-22809c409d68-isvc-sklearn-graph-raw-hpa-5628a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:29:31.809784 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:31.809787 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b4c02de-688d-4548-98fc-22809c409d68-kserve-provision-location\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:29:31.809784 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:31.809798 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xt8wb\" (UniqueName: \"kubernetes.io/projected/6b4c02de-688d-4548-98fc-22809c409d68-kube-api-access-xt8wb\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:29:32.377373 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.377339 2572 generic.go:358] "Generic (PLEG): container finished" podID="6b4c02de-688d-4548-98fc-22809c409d68" containerID="6466237f66839e86d380179e7ad503a4b5751dc745bd9137411cdd6a07af08f5" exitCode=0 Apr 28 19:29:32.377766 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.377429 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" Apr 28 19:29:32.377766 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.377426 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" event={"ID":"6b4c02de-688d-4548-98fc-22809c409d68","Type":"ContainerDied","Data":"6466237f66839e86d380179e7ad503a4b5751dc745bd9137411cdd6a07af08f5"} Apr 28 19:29:32.377766 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.377467 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf" event={"ID":"6b4c02de-688d-4548-98fc-22809c409d68","Type":"ContainerDied","Data":"387840fcd6750644f64384ea9147f800c006c3740d6d40566446e6a8adf73469"} Apr 28 19:29:32.377766 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.377484 2572 scope.go:117] "RemoveContainer" containerID="e6c7dd2699b870d8cfa0ada9ec7f0e8c76a66414567020a97500531331fd0298" Apr 28 19:29:32.385448 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.385432 2572 scope.go:117] "RemoveContainer" containerID="6466237f66839e86d380179e7ad503a4b5751dc745bd9137411cdd6a07af08f5" Apr 28 19:29:32.392713 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.392695 2572 scope.go:117] "RemoveContainer" containerID="a71decaefc99947a416f486bbec016fe62f85bced25ff423222a6165c78fd6a3" Apr 28 19:29:32.399270 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.399247 2572 scope.go:117] "RemoveContainer" containerID="e6c7dd2699b870d8cfa0ada9ec7f0e8c76a66414567020a97500531331fd0298" Apr 28 19:29:32.399558 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:29:32.399533 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6c7dd2699b870d8cfa0ada9ec7f0e8c76a66414567020a97500531331fd0298\": container with ID starting with 
e6c7dd2699b870d8cfa0ada9ec7f0e8c76a66414567020a97500531331fd0298 not found: ID does not exist" containerID="e6c7dd2699b870d8cfa0ada9ec7f0e8c76a66414567020a97500531331fd0298"
Apr 28 19:29:32.399661 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.399564 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c7dd2699b870d8cfa0ada9ec7f0e8c76a66414567020a97500531331fd0298"} err="failed to get container status \"e6c7dd2699b870d8cfa0ada9ec7f0e8c76a66414567020a97500531331fd0298\": rpc error: code = NotFound desc = could not find container \"e6c7dd2699b870d8cfa0ada9ec7f0e8c76a66414567020a97500531331fd0298\": container with ID starting with e6c7dd2699b870d8cfa0ada9ec7f0e8c76a66414567020a97500531331fd0298 not found: ID does not exist"
Apr 28 19:29:32.399661 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.399591 2572 scope.go:117] "RemoveContainer" containerID="6466237f66839e86d380179e7ad503a4b5751dc745bd9137411cdd6a07af08f5"
Apr 28 19:29:32.399873 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:29:32.399854 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6466237f66839e86d380179e7ad503a4b5751dc745bd9137411cdd6a07af08f5\": container with ID starting with 6466237f66839e86d380179e7ad503a4b5751dc745bd9137411cdd6a07af08f5 not found: ID does not exist" containerID="6466237f66839e86d380179e7ad503a4b5751dc745bd9137411cdd6a07af08f5"
Apr 28 19:29:32.399929 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.399880 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6466237f66839e86d380179e7ad503a4b5751dc745bd9137411cdd6a07af08f5"} err="failed to get container status \"6466237f66839e86d380179e7ad503a4b5751dc745bd9137411cdd6a07af08f5\": rpc error: code = NotFound desc = could not find container \"6466237f66839e86d380179e7ad503a4b5751dc745bd9137411cdd6a07af08f5\": container with ID starting with 6466237f66839e86d380179e7ad503a4b5751dc745bd9137411cdd6a07af08f5 not found: ID does not exist"
Apr 28 19:29:32.399929 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.399897 2572 scope.go:117] "RemoveContainer" containerID="a71decaefc99947a416f486bbec016fe62f85bced25ff423222a6165c78fd6a3"
Apr 28 19:29:32.400481 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:29:32.400273 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a71decaefc99947a416f486bbec016fe62f85bced25ff423222a6165c78fd6a3\": container with ID starting with a71decaefc99947a416f486bbec016fe62f85bced25ff423222a6165c78fd6a3 not found: ID does not exist" containerID="a71decaefc99947a416f486bbec016fe62f85bced25ff423222a6165c78fd6a3"
Apr 28 19:29:32.400481 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.400319 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a71decaefc99947a416f486bbec016fe62f85bced25ff423222a6165c78fd6a3"} err="failed to get container status \"a71decaefc99947a416f486bbec016fe62f85bced25ff423222a6165c78fd6a3\": rpc error: code = NotFound desc = could not find container \"a71decaefc99947a416f486bbec016fe62f85bced25ff423222a6165c78fd6a3\": container with ID starting with a71decaefc99947a416f486bbec016fe62f85bced25ff423222a6165c78fd6a3 not found: ID does not exist"
Apr 28 19:29:32.402130 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.402108 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf"]
Apr 28 19:29:32.406704 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.406682 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-5628a-predictor-c488b8fff-t6hnf"]
Apr 28 19:29:32.889259 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:32.889227 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b4c02de-688d-4548-98fc-22809c409d68" path="/var/lib/kubelet/pods/6b4c02de-688d-4548-98fc-22809c409d68/volumes"
Apr 28 19:29:38.386030 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:38.386000 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq"
Apr 28 19:29:47.174405 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174367 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"]
Apr 28 19:29:47.174786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174661 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="storage-initializer"
Apr 28 19:29:47.174786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174672 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="storage-initializer"
Apr 28 19:29:47.174786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174681 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kserve-container"
Apr 28 19:29:47.174786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174688 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kserve-container"
Apr 28 19:29:47.174786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174699 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kserve-container"
Apr 28 19:29:47.174786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174705 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kserve-container"
Apr 28 19:29:47.174786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174714 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="storage-initializer"
Apr 28 19:29:47.174786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174720 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="storage-initializer"
Apr 28 19:29:47.174786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174730 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kube-rbac-proxy"
Apr 28 19:29:47.174786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174735 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kube-rbac-proxy"
Apr 28 19:29:47.174786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174742 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kube-rbac-proxy"
Apr 28 19:29:47.174786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174747 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kube-rbac-proxy"
Apr 28 19:29:47.174786 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174787 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kserve-container"
Apr 28 19:29:47.175144 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174794 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c16c9c9-c169-4cd6-afea-4cd267985347" containerName="kube-rbac-proxy"
Apr 28 19:29:47.175144 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174802 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kube-rbac-proxy"
Apr 28 19:29:47.175144 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.174808 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b4c02de-688d-4548-98fc-22809c409d68" containerName="kserve-container"
Apr 28 19:29:47.177842 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.177826 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:47.180452 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.180427 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-a8bff-kube-rbac-proxy-sar-config\""
Apr 28 19:29:47.180584 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.180469 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-a8bff-predictor-serving-cert\""
Apr 28 19:29:47.188914 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.188890 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"]
Apr 28 19:29:47.235073 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.235039 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0926c6f8-5cea-4436-a832-15b9b25b565c-proxy-tls\") pod \"isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:47.235281 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.235085 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vllhm\" (UniqueName: \"kubernetes.io/projected/0926c6f8-5cea-4436-a832-15b9b25b565c-kube-api-access-vllhm\") pod \"isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:47.235281 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.235153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0926c6f8-5cea-4436-a832-15b9b25b565c-kserve-provision-location\") pod \"isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:47.235281 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.235239 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-raw-a8bff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0926c6f8-5cea-4436-a832-15b9b25b565c-isvc-logger-raw-a8bff-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:47.335972 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.335937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-raw-a8bff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0926c6f8-5cea-4436-a832-15b9b25b565c-isvc-logger-raw-a8bff-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:47.336153 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.336015 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0926c6f8-5cea-4436-a832-15b9b25b565c-proxy-tls\") pod \"isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:47.336153 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.336034 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vllhm\" (UniqueName: \"kubernetes.io/projected/0926c6f8-5cea-4436-a832-15b9b25b565c-kube-api-access-vllhm\") pod \"isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:47.336153 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.336052 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0926c6f8-5cea-4436-a832-15b9b25b565c-kserve-provision-location\") pod \"isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:47.336489 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.336467 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0926c6f8-5cea-4436-a832-15b9b25b565c-kserve-provision-location\") pod \"isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:47.336625 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.336604 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-raw-a8bff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0926c6f8-5cea-4436-a832-15b9b25b565c-isvc-logger-raw-a8bff-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:47.338481 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.338463 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0926c6f8-5cea-4436-a832-15b9b25b565c-proxy-tls\") pod \"isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:47.344909 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.344887 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vllhm\" (UniqueName: \"kubernetes.io/projected/0926c6f8-5cea-4436-a832-15b9b25b565c-kube-api-access-vllhm\") pod \"isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:47.488284 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.488249 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:47.616032 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:47.615894 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"]
Apr 28 19:29:47.618795 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:29:47.618758 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0926c6f8_5cea_4436_a832_15b9b25b565c.slice/crio-a6c7ee5fd5029ab47740dcc914ec77a060ee2d35b2ae418e4fe5fede72feef00 WatchSource:0}: Error finding container a6c7ee5fd5029ab47740dcc914ec77a060ee2d35b2ae418e4fe5fede72feef00: Status 404 returned error can't find the container with id a6c7ee5fd5029ab47740dcc914ec77a060ee2d35b2ae418e4fe5fede72feef00
Apr 28 19:29:48.421257 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:48.421214 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" event={"ID":"0926c6f8-5cea-4436-a832-15b9b25b565c","Type":"ContainerStarted","Data":"add3f78885228066908271e2506c8f7b49cd0571b2cc5d42d22e0280c07a2263"}
Apr 28 19:29:48.421257 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:48.421260 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" event={"ID":"0926c6f8-5cea-4436-a832-15b9b25b565c","Type":"ContainerStarted","Data":"a6c7ee5fd5029ab47740dcc914ec77a060ee2d35b2ae418e4fe5fede72feef00"}
Apr 28 19:29:51.430255 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:51.430218 2572 generic.go:358] "Generic (PLEG): container finished" podID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerID="add3f78885228066908271e2506c8f7b49cd0571b2cc5d42d22e0280c07a2263" exitCode=0
Apr 28 19:29:51.430604 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:51.430286 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" event={"ID":"0926c6f8-5cea-4436-a832-15b9b25b565c","Type":"ContainerDied","Data":"add3f78885228066908271e2506c8f7b49cd0571b2cc5d42d22e0280c07a2263"}
Apr 28 19:29:52.435862 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:52.435826 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" event={"ID":"0926c6f8-5cea-4436-a832-15b9b25b565c","Type":"ContainerStarted","Data":"be9282d370743d3c0454f742ca304e4e23b2b1f694ca10aab16c97b059dbf720"}
Apr 28 19:29:52.435862 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:52.435865 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" event={"ID":"0926c6f8-5cea-4436-a832-15b9b25b565c","Type":"ContainerStarted","Data":"8ee8c31794d5eaaabb9c44318a65e063f094ab5470961379b7e18f69d6c191c6"}
Apr 28 19:29:52.436375 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:52.435875 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" event={"ID":"0926c6f8-5cea-4436-a832-15b9b25b565c","Type":"ContainerStarted","Data":"1835b8b11a9a6b801ee178825213af42dfdaa6be9af873a5d139c747adbf1933"}
Apr 28 19:29:52.436375 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:52.436344 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:52.436453 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:52.436383 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:52.436453 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:52.436391 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:52.437614 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:52.437580 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 28 19:29:52.438186 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:52.438149 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:29:52.461257 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:52.458191 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podStartSLOduration=5.458144628 podStartE2EDuration="5.458144628s" podCreationTimestamp="2026-04-28 19:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:29:52.456510525 +0000 UTC m=+834.128955629" watchObservedRunningTime="2026-04-28 19:29:52.458144628 +0000 UTC m=+834.130589733"
Apr 28 19:29:53.439964 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:53.439921 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 28 19:29:53.440437 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:53.440381 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:29:58.443591 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:58.443562 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:29:58.444194 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:58.444145 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 28 19:29:58.444431 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:29:58.444409 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:30:08.444915 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:30:08.444873 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 28 19:30:08.445425 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:30:08.445371 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:30:18.444849 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:30:18.444801 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 28 19:30:18.445373 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:30:18.445350 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:30:28.444394 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:30:28.444350 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 28 19:30:28.444852 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:30:28.444828 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:30:38.444756 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:30:38.444708 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 28 19:30:38.445324 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:30:38.445263 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:30:48.444152 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:30:48.444110 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 28 19:30:48.444639 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:30:48.444586 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 28 19:30:58.444527 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:30:58.444494 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:30:58.445125 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:30:58.444682 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"
Apr 28 19:31:11.357287 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:11.357243 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq_c323af27-9a6e-4bba-86e5-6e631c80460b/kserve-container/0.log"
Apr 28 19:31:11.657726 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:11.657647 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq"]
Apr 28 19:31:11.657958 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:11.657928 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" podUID="c323af27-9a6e-4bba-86e5-6e631c80460b" containerName="kserve-container" containerID="cri-o://acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753" gracePeriod=30
Apr 28 19:31:11.658098 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:11.657971 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" podUID="c323af27-9a6e-4bba-86e5-6e631c80460b" containerName="kube-rbac-proxy" containerID="cri-o://af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855" gracePeriod=30
Apr 28 19:31:11.899075 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:11.899051 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq"
Apr 28 19:31:11.926054 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:11.925979 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-raw-a8bff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c323af27-9a6e-4bba-86e5-6e631c80460b-message-dumper-raw-a8bff-kube-rbac-proxy-sar-config\") pod \"c323af27-9a6e-4bba-86e5-6e631c80460b\" (UID: \"c323af27-9a6e-4bba-86e5-6e631c80460b\") "
Apr 28 19:31:11.926192 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:11.926054 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6krr\" (UniqueName: \"kubernetes.io/projected/c323af27-9a6e-4bba-86e5-6e631c80460b-kube-api-access-j6krr\") pod \"c323af27-9a6e-4bba-86e5-6e631c80460b\" (UID: \"c323af27-9a6e-4bba-86e5-6e631c80460b\") "
Apr 28 19:31:11.926192 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:11.926134 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c323af27-9a6e-4bba-86e5-6e631c80460b-proxy-tls\") pod \"c323af27-9a6e-4bba-86e5-6e631c80460b\" (UID: \"c323af27-9a6e-4bba-86e5-6e631c80460b\") "
Apr 28 19:31:11.926392 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:11.926366 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c323af27-9a6e-4bba-86e5-6e631c80460b-message-dumper-raw-a8bff-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-raw-a8bff-kube-rbac-proxy-sar-config") pod "c323af27-9a6e-4bba-86e5-6e631c80460b" (UID: "c323af27-9a6e-4bba-86e5-6e631c80460b"). InnerVolumeSpecName "message-dumper-raw-a8bff-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:31:11.928077 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:11.928054 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c323af27-9a6e-4bba-86e5-6e631c80460b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c323af27-9a6e-4bba-86e5-6e631c80460b" (UID: "c323af27-9a6e-4bba-86e5-6e631c80460b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:31:11.928194 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:11.928103 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c323af27-9a6e-4bba-86e5-6e631c80460b-kube-api-access-j6krr" (OuterVolumeSpecName: "kube-api-access-j6krr") pod "c323af27-9a6e-4bba-86e5-6e631c80460b" (UID: "c323af27-9a6e-4bba-86e5-6e631c80460b"). InnerVolumeSpecName "kube-api-access-j6krr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:31:12.027492 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.027458 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j6krr\" (UniqueName: \"kubernetes.io/projected/c323af27-9a6e-4bba-86e5-6e631c80460b-kube-api-access-j6krr\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\""
Apr 28 19:31:12.027492 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.027485 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c323af27-9a6e-4bba-86e5-6e631c80460b-proxy-tls\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\""
Apr 28 19:31:12.027492 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.027495 2572 reconciler_common.go:299] "Volume detached for volume \"message-dumper-raw-a8bff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c323af27-9a6e-4bba-86e5-6e631c80460b-message-dumper-raw-a8bff-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\""
Apr 28 19:31:12.165944 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.165906 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"]
Apr 28 19:31:12.166434 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.166321 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kserve-container" containerID="cri-o://1835b8b11a9a6b801ee178825213af42dfdaa6be9af873a5d139c747adbf1933" gracePeriod=30
Apr 28 19:31:12.166434 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.166330 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="agent" containerID="cri-o://be9282d370743d3c0454f742ca304e4e23b2b1f694ca10aab16c97b059dbf720" gracePeriod=30
Apr 28 19:31:12.166434 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.166365 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kube-rbac-proxy" containerID="cri-o://8ee8c31794d5eaaabb9c44318a65e063f094ab5470961379b7e18f69d6c191c6" gracePeriod=30
Apr 28 19:31:12.381330 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.381295 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"]
Apr 28 19:31:12.381793 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.381698 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c323af27-9a6e-4bba-86e5-6e631c80460b" containerName="kserve-container"
Apr 28 19:31:12.381793 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.381714 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c323af27-9a6e-4bba-86e5-6e631c80460b" containerName="kserve-container"
Apr 28 19:31:12.381793 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.381733 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c323af27-9a6e-4bba-86e5-6e631c80460b" containerName="kube-rbac-proxy"
Apr 28 19:31:12.381793 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.381741 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c323af27-9a6e-4bba-86e5-6e631c80460b" containerName="kube-rbac-proxy"
Apr 28 19:31:12.382048 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.381799 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c323af27-9a6e-4bba-86e5-6e631c80460b" containerName="kube-rbac-proxy"
Apr 28 19:31:12.382048 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.381811 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c323af27-9a6e-4bba-86e5-6e631c80460b" containerName="kserve-container"
Apr 28 19:31:12.385041 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.385019 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"
Apr 28 19:31:12.388372 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.388344 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-9745c-kube-rbac-proxy-sar-config\""
Apr 28 19:31:12.388493 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.388387 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-9745c-predictor-serving-cert\""
Apr 28 19:31:12.402367 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.402338 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"]
Apr 28 19:31:12.431607 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.431577 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d14f466-716d-4651-92f3-f764554541ae-proxy-tls\") pod \"isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"
Apr 28 19:31:12.431755 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.431627 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d14f466-716d-4651-92f3-f764554541ae-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"
Apr 28 19:31:12.431755 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.431694 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-scale-raw-9745c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9d14f466-716d-4651-92f3-f764554541ae-isvc-sklearn-scale-raw-9745c-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"
Apr 28 19:31:12.431755 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.431741 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xpkr\" (UniqueName: \"kubernetes.io/projected/9d14f466-716d-4651-92f3-f764554541ae-kube-api-access-7xpkr\") pod \"isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"
Apr 28 19:31:12.533127 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.533030 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d14f466-716d-4651-92f3-f764554541ae-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"
Apr 28 19:31:12.533127 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.533084 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-scale-raw-9745c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9d14f466-716d-4651-92f3-f764554541ae-isvc-sklearn-scale-raw-9745c-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"
Apr 28 19:31:12.533127 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.533107 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xpkr\" (UniqueName: \"kubernetes.io/projected/9d14f466-716d-4651-92f3-f764554541ae-kube-api-access-7xpkr\") pod \"isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"
Apr 28 19:31:12.533379 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.533151 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d14f466-716d-4651-92f3-f764554541ae-proxy-tls\") pod \"isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"
Apr 28 19:31:12.533543 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.533516 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d14f466-716d-4651-92f3-f764554541ae-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"
Apr 28 19:31:12.533797 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.533778 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-scale-raw-9745c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9d14f466-716d-4651-92f3-f764554541ae-isvc-sklearn-scale-raw-9745c-kube-rbac-proxy-sar-config\") pod
\"isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" Apr 28 19:31:12.535482 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.535465 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d14f466-716d-4651-92f3-f764554541ae-proxy-tls\") pod \"isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" Apr 28 19:31:12.541734 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.541714 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xpkr\" (UniqueName: \"kubernetes.io/projected/9d14f466-716d-4651-92f3-f764554541ae-kube-api-access-7xpkr\") pod \"isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" Apr 28 19:31:12.658323 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.658287 2572 generic.go:358] "Generic (PLEG): container finished" podID="c323af27-9a6e-4bba-86e5-6e631c80460b" containerID="af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855" exitCode=2 Apr 28 19:31:12.658323 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.658314 2572 generic.go:358] "Generic (PLEG): container finished" podID="c323af27-9a6e-4bba-86e5-6e631c80460b" containerID="acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753" exitCode=2 Apr 28 19:31:12.658547 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.658357 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" Apr 28 19:31:12.658547 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.658370 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" event={"ID":"c323af27-9a6e-4bba-86e5-6e631c80460b","Type":"ContainerDied","Data":"af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855"} Apr 28 19:31:12.658547 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.658409 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" event={"ID":"c323af27-9a6e-4bba-86e5-6e631c80460b","Type":"ContainerDied","Data":"acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753"} Apr 28 19:31:12.658547 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.658421 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq" event={"ID":"c323af27-9a6e-4bba-86e5-6e631c80460b","Type":"ContainerDied","Data":"46313cf29e5ab7f67d8817ec134074c8adc7f8f285e9b0d09aee2d763402a934"} Apr 28 19:31:12.658547 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.658438 2572 scope.go:117] "RemoveContainer" containerID="af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855" Apr 28 19:31:12.660639 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.660618 2572 generic.go:358] "Generic (PLEG): container finished" podID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerID="8ee8c31794d5eaaabb9c44318a65e063f094ab5470961379b7e18f69d6c191c6" exitCode=2 Apr 28 19:31:12.660734 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.660658 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" 
event={"ID":"0926c6f8-5cea-4436-a832-15b9b25b565c","Type":"ContainerDied","Data":"8ee8c31794d5eaaabb9c44318a65e063f094ab5470961379b7e18f69d6c191c6"} Apr 28 19:31:12.666701 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.666684 2572 scope.go:117] "RemoveContainer" containerID="acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753" Apr 28 19:31:12.674740 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.674715 2572 scope.go:117] "RemoveContainer" containerID="af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855" Apr 28 19:31:12.674999 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:31:12.674980 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855\": container with ID starting with af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855 not found: ID does not exist" containerID="af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855" Apr 28 19:31:12.675061 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.675008 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855"} err="failed to get container status \"af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855\": rpc error: code = NotFound desc = could not find container \"af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855\": container with ID starting with af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855 not found: ID does not exist" Apr 28 19:31:12.675061 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.675025 2572 scope.go:117] "RemoveContainer" containerID="acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753" Apr 28 19:31:12.675292 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:31:12.675272 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753\": container with ID starting with acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753 not found: ID does not exist" containerID="acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753" Apr 28 19:31:12.675362 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.675296 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753"} err="failed to get container status \"acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753\": rpc error: code = NotFound desc = could not find container \"acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753\": container with ID starting with acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753 not found: ID does not exist" Apr 28 19:31:12.675362 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.675312 2572 scope.go:117] "RemoveContainer" containerID="af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855" Apr 28 19:31:12.675500 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.675483 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855"} err="failed to get container status \"af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855\": rpc error: code = NotFound desc = could not find container \"af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855\": container with ID starting with af9e086cf1f961382511602295e1e77a27d1a432dc9d7af084cf95500bb29855 not found: ID does not exist" Apr 28 19:31:12.675500 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.675500 2572 scope.go:117] "RemoveContainer" containerID="acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753" Apr 28 19:31:12.675718 ip-10-0-134-36 kubenswrapper[2572]: I0428 
19:31:12.675700 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753"} err="failed to get container status \"acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753\": rpc error: code = NotFound desc = could not find container \"acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753\": container with ID starting with acef1045314611fa8d0fb8470dd97ece8bfdd149af59c10f4ae778488d0d4753 not found: ID does not exist" Apr 28 19:31:12.693117 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.693091 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq"] Apr 28 19:31:12.694795 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.694770 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" Apr 28 19:31:12.702826 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.702801 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-a8bff-predictor-6d6466ff77-7bxkq"] Apr 28 19:31:12.814427 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.814352 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"] Apr 28 19:31:12.817864 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:31:12.817833 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d14f466_716d_4651_92f3_f764554541ae.slice/crio-90ddc9118f043b35f1f2f37afcf656bdf868d8a92c222d8ed7c940e8a7e03256 WatchSource:0}: Error finding container 90ddc9118f043b35f1f2f37afcf656bdf868d8a92c222d8ed7c940e8a7e03256: Status 404 returned error can't find the container with id 90ddc9118f043b35f1f2f37afcf656bdf868d8a92c222d8ed7c940e8a7e03256 Apr 28 
19:31:12.888611 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:12.888586 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c323af27-9a6e-4bba-86e5-6e631c80460b" path="/var/lib/kubelet/pods/c323af27-9a6e-4bba-86e5-6e631c80460b/volumes" Apr 28 19:31:13.440650 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:13.440611 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 28 19:31:13.664803 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:13.664765 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" event={"ID":"9d14f466-716d-4651-92f3-f764554541ae","Type":"ContainerStarted","Data":"2b5f027f74a18e554c40854642ecc4e26ed79345b586779a426104317fbf4580"} Apr 28 19:31:13.664803 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:13.664809 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" event={"ID":"9d14f466-716d-4651-92f3-f764554541ae","Type":"ContainerStarted","Data":"90ddc9118f043b35f1f2f37afcf656bdf868d8a92c222d8ed7c940e8a7e03256"} Apr 28 19:31:16.675930 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:16.675836 2572 generic.go:358] "Generic (PLEG): container finished" podID="9d14f466-716d-4651-92f3-f764554541ae" containerID="2b5f027f74a18e554c40854642ecc4e26ed79345b586779a426104317fbf4580" exitCode=0 Apr 28 19:31:16.675930 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:16.675911 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" 
event={"ID":"9d14f466-716d-4651-92f3-f764554541ae","Type":"ContainerDied","Data":"2b5f027f74a18e554c40854642ecc4e26ed79345b586779a426104317fbf4580"} Apr 28 19:31:16.678156 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:16.678131 2572 generic.go:358] "Generic (PLEG): container finished" podID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerID="1835b8b11a9a6b801ee178825213af42dfdaa6be9af873a5d139c747adbf1933" exitCode=0 Apr 28 19:31:16.678269 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:16.678211 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" event={"ID":"0926c6f8-5cea-4436-a832-15b9b25b565c","Type":"ContainerDied","Data":"1835b8b11a9a6b801ee178825213af42dfdaa6be9af873a5d139c747adbf1933"} Apr 28 19:31:17.682972 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:17.682933 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" event={"ID":"9d14f466-716d-4651-92f3-f764554541ae","Type":"ContainerStarted","Data":"da3b8d135f5db06353346d44d1c134210facd6456b14afd1f8035a342e4c1806"} Apr 28 19:31:17.682972 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:17.682973 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" event={"ID":"9d14f466-716d-4651-92f3-f764554541ae","Type":"ContainerStarted","Data":"d0525abaad8ad6b8a30beb26b851ea17b4783bd9182a30869e16de72f2846870"} Apr 28 19:31:17.683398 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:17.683275 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" Apr 28 19:31:17.703708 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:17.703651 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" 
podStartSLOduration=5.703630536 podStartE2EDuration="5.703630536s" podCreationTimestamp="2026-04-28 19:31:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:31:17.701483963 +0000 UTC m=+919.373929066" watchObservedRunningTime="2026-04-28 19:31:17.703630536 +0000 UTC m=+919.376075635" Apr 28 19:31:18.441074 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:18.441036 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 28 19:31:18.444346 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:18.444316 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 28 19:31:18.444737 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:18.444709 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:31:18.685967 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:18.685929 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" Apr 28 19:31:18.687093 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:18.687066 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" 
podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:31:19.688225 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:19.688160 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:31:23.440782 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:23.440734 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 28 19:31:23.441224 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:23.440854 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" Apr 28 19:31:24.692401 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:24.692370 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" Apr 28 19:31:24.692971 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:24.692934 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:31:28.440984 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:28.440925 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 28 19:31:28.444264 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:28.444223 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 28 19:31:28.444604 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:28.444575 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:31:33.440130 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:33.440086 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 28 19:31:34.693868 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:34.693832 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:31:38.440826 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:38.440781 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 28 19:31:38.444184 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:38.444133 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 28 19:31:38.444319 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:38.444302 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" Apr 28 19:31:38.444492 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:38.444468 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:31:38.444597 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:38.444584 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" Apr 28 19:31:42.328954 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.328925 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" Apr 28 19:31:42.467284 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.467251 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0926c6f8-5cea-4436-a832-15b9b25b565c-proxy-tls\") pod \"0926c6f8-5cea-4436-a832-15b9b25b565c\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " Apr 28 19:31:42.467456 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.467298 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0926c6f8-5cea-4436-a832-15b9b25b565c-kserve-provision-location\") pod \"0926c6f8-5cea-4436-a832-15b9b25b565c\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " Apr 28 19:31:42.467456 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.467353 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-raw-a8bff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0926c6f8-5cea-4436-a832-15b9b25b565c-isvc-logger-raw-a8bff-kube-rbac-proxy-sar-config\") pod \"0926c6f8-5cea-4436-a832-15b9b25b565c\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " Apr 28 19:31:42.467566 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.467461 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vllhm\" (UniqueName: \"kubernetes.io/projected/0926c6f8-5cea-4436-a832-15b9b25b565c-kube-api-access-vllhm\") pod \"0926c6f8-5cea-4436-a832-15b9b25b565c\" (UID: \"0926c6f8-5cea-4436-a832-15b9b25b565c\") " Apr 28 19:31:42.467699 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.467674 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0926c6f8-5cea-4436-a832-15b9b25b565c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"0926c6f8-5cea-4436-a832-15b9b25b565c" (UID: "0926c6f8-5cea-4436-a832-15b9b25b565c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:31:42.467751 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.467696 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0926c6f8-5cea-4436-a832-15b9b25b565c-isvc-logger-raw-a8bff-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-raw-a8bff-kube-rbac-proxy-sar-config") pod "0926c6f8-5cea-4436-a832-15b9b25b565c" (UID: "0926c6f8-5cea-4436-a832-15b9b25b565c"). InnerVolumeSpecName "isvc-logger-raw-a8bff-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:31:42.469403 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.469380 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0926c6f8-5cea-4436-a832-15b9b25b565c-kube-api-access-vllhm" (OuterVolumeSpecName: "kube-api-access-vllhm") pod "0926c6f8-5cea-4436-a832-15b9b25b565c" (UID: "0926c6f8-5cea-4436-a832-15b9b25b565c"). InnerVolumeSpecName "kube-api-access-vllhm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:31:42.469470 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.469417 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0926c6f8-5cea-4436-a832-15b9b25b565c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0926c6f8-5cea-4436-a832-15b9b25b565c" (UID: "0926c6f8-5cea-4436-a832-15b9b25b565c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:31:42.568617 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.568582 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-raw-a8bff-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0926c6f8-5cea-4436-a832-15b9b25b565c-isvc-logger-raw-a8bff-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:31:42.568617 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.568611 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vllhm\" (UniqueName: \"kubernetes.io/projected/0926c6f8-5cea-4436-a832-15b9b25b565c-kube-api-access-vllhm\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:31:42.568617 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.568621 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0926c6f8-5cea-4436-a832-15b9b25b565c-proxy-tls\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:31:42.568828 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.568633 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0926c6f8-5cea-4436-a832-15b9b25b565c-kserve-provision-location\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:31:42.752214 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.752101 2572 generic.go:358] "Generic (PLEG): container finished" podID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerID="be9282d370743d3c0454f742ca304e4e23b2b1f694ca10aab16c97b059dbf720" exitCode=137 Apr 28 19:31:42.752214 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.752201 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" 
event={"ID":"0926c6f8-5cea-4436-a832-15b9b25b565c","Type":"ContainerDied","Data":"be9282d370743d3c0454f742ca304e4e23b2b1f694ca10aab16c97b059dbf720"} Apr 28 19:31:42.752399 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.752222 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" Apr 28 19:31:42.752399 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.752240 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp" event={"ID":"0926c6f8-5cea-4436-a832-15b9b25b565c","Type":"ContainerDied","Data":"a6c7ee5fd5029ab47740dcc914ec77a060ee2d35b2ae418e4fe5fede72feef00"} Apr 28 19:31:42.752399 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.752256 2572 scope.go:117] "RemoveContainer" containerID="be9282d370743d3c0454f742ca304e4e23b2b1f694ca10aab16c97b059dbf720" Apr 28 19:31:42.761603 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.760989 2572 scope.go:117] "RemoveContainer" containerID="8ee8c31794d5eaaabb9c44318a65e063f094ab5470961379b7e18f69d6c191c6" Apr 28 19:31:42.768323 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.768303 2572 scope.go:117] "RemoveContainer" containerID="1835b8b11a9a6b801ee178825213af42dfdaa6be9af873a5d139c747adbf1933" Apr 28 19:31:42.773937 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.773913 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"] Apr 28 19:31:42.775151 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.775131 2572 scope.go:117] "RemoveContainer" containerID="add3f78885228066908271e2506c8f7b49cd0571b2cc5d42d22e0280c07a2263" Apr 28 19:31:42.779711 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.779690 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-a8bff-predictor-b8fd65b67-48kfp"] Apr 28 19:31:42.782160 ip-10-0-134-36 
kubenswrapper[2572]: I0428 19:31:42.782144 2572 scope.go:117] "RemoveContainer" containerID="be9282d370743d3c0454f742ca304e4e23b2b1f694ca10aab16c97b059dbf720" Apr 28 19:31:42.782433 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:31:42.782416 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be9282d370743d3c0454f742ca304e4e23b2b1f694ca10aab16c97b059dbf720\": container with ID starting with be9282d370743d3c0454f742ca304e4e23b2b1f694ca10aab16c97b059dbf720 not found: ID does not exist" containerID="be9282d370743d3c0454f742ca304e4e23b2b1f694ca10aab16c97b059dbf720" Apr 28 19:31:42.782483 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.782442 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9282d370743d3c0454f742ca304e4e23b2b1f694ca10aab16c97b059dbf720"} err="failed to get container status \"be9282d370743d3c0454f742ca304e4e23b2b1f694ca10aab16c97b059dbf720\": rpc error: code = NotFound desc = could not find container \"be9282d370743d3c0454f742ca304e4e23b2b1f694ca10aab16c97b059dbf720\": container with ID starting with be9282d370743d3c0454f742ca304e4e23b2b1f694ca10aab16c97b059dbf720 not found: ID does not exist" Apr 28 19:31:42.782483 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.782459 2572 scope.go:117] "RemoveContainer" containerID="8ee8c31794d5eaaabb9c44318a65e063f094ab5470961379b7e18f69d6c191c6" Apr 28 19:31:42.782679 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:31:42.782664 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ee8c31794d5eaaabb9c44318a65e063f094ab5470961379b7e18f69d6c191c6\": container with ID starting with 8ee8c31794d5eaaabb9c44318a65e063f094ab5470961379b7e18f69d6c191c6 not found: ID does not exist" containerID="8ee8c31794d5eaaabb9c44318a65e063f094ab5470961379b7e18f69d6c191c6" Apr 28 19:31:42.782720 ip-10-0-134-36 kubenswrapper[2572]: I0428 
19:31:42.782682 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ee8c31794d5eaaabb9c44318a65e063f094ab5470961379b7e18f69d6c191c6"} err="failed to get container status \"8ee8c31794d5eaaabb9c44318a65e063f094ab5470961379b7e18f69d6c191c6\": rpc error: code = NotFound desc = could not find container \"8ee8c31794d5eaaabb9c44318a65e063f094ab5470961379b7e18f69d6c191c6\": container with ID starting with 8ee8c31794d5eaaabb9c44318a65e063f094ab5470961379b7e18f69d6c191c6 not found: ID does not exist" Apr 28 19:31:42.782720 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.782698 2572 scope.go:117] "RemoveContainer" containerID="1835b8b11a9a6b801ee178825213af42dfdaa6be9af873a5d139c747adbf1933" Apr 28 19:31:42.782908 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:31:42.782890 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1835b8b11a9a6b801ee178825213af42dfdaa6be9af873a5d139c747adbf1933\": container with ID starting with 1835b8b11a9a6b801ee178825213af42dfdaa6be9af873a5d139c747adbf1933 not found: ID does not exist" containerID="1835b8b11a9a6b801ee178825213af42dfdaa6be9af873a5d139c747adbf1933" Apr 28 19:31:42.782973 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.782917 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1835b8b11a9a6b801ee178825213af42dfdaa6be9af873a5d139c747adbf1933"} err="failed to get container status \"1835b8b11a9a6b801ee178825213af42dfdaa6be9af873a5d139c747adbf1933\": rpc error: code = NotFound desc = could not find container \"1835b8b11a9a6b801ee178825213af42dfdaa6be9af873a5d139c747adbf1933\": container with ID starting with 1835b8b11a9a6b801ee178825213af42dfdaa6be9af873a5d139c747adbf1933 not found: ID does not exist" Apr 28 19:31:42.782973 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.782939 2572 scope.go:117] "RemoveContainer" 
containerID="add3f78885228066908271e2506c8f7b49cd0571b2cc5d42d22e0280c07a2263" Apr 28 19:31:42.783146 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:31:42.783130 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"add3f78885228066908271e2506c8f7b49cd0571b2cc5d42d22e0280c07a2263\": container with ID starting with add3f78885228066908271e2506c8f7b49cd0571b2cc5d42d22e0280c07a2263 not found: ID does not exist" containerID="add3f78885228066908271e2506c8f7b49cd0571b2cc5d42d22e0280c07a2263" Apr 28 19:31:42.783201 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.783149 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add3f78885228066908271e2506c8f7b49cd0571b2cc5d42d22e0280c07a2263"} err="failed to get container status \"add3f78885228066908271e2506c8f7b49cd0571b2cc5d42d22e0280c07a2263\": rpc error: code = NotFound desc = could not find container \"add3f78885228066908271e2506c8f7b49cd0571b2cc5d42d22e0280c07a2263\": container with ID starting with add3f78885228066908271e2506c8f7b49cd0571b2cc5d42d22e0280c07a2263 not found: ID does not exist" Apr 28 19:31:42.888567 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:42.888533 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" path="/var/lib/kubelet/pods/0926c6f8-5cea-4436-a832-15b9b25b565c/volumes" Apr 28 19:31:44.693719 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:44.693683 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:31:54.693091 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:31:54.693047 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:32:04.693199 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:32:04.693137 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:32:14.693011 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:32:14.692909 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:32:24.693227 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:32:24.693168 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:32:34.693720 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:32:34.693682 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:32:44.693351 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:32:44.693306 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" 
podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:32:47.885692 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:32:47.885644 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:32:57.886066 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:32:57.886023 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:33:07.886228 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:07.886158 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:33:17.885725 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:17.885668 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:33:27.886066 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:27.886033 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" Apr 28 19:33:32.304567 ip-10-0-134-36 kubenswrapper[2572]: I0428 
19:33:32.304529 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"] Apr 28 19:33:32.305032 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.304891 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" containerID="cri-o://d0525abaad8ad6b8a30beb26b851ea17b4783bd9182a30869e16de72f2846870" gracePeriod=30 Apr 28 19:33:32.305032 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.304931 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kube-rbac-proxy" containerID="cri-o://da3b8d135f5db06353346d44d1c134210facd6456b14afd1f8035a342e4c1806" gracePeriod=30 Apr 28 19:33:32.622842 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.622747 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n"] Apr 28 19:33:32.623116 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.623098 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kube-rbac-proxy" Apr 28 19:33:32.623223 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.623120 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kube-rbac-proxy" Apr 28 19:33:32.623223 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.623131 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kserve-container" Apr 28 19:33:32.623223 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.623139 2572 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kserve-container" Apr 28 19:33:32.623223 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.623158 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="storage-initializer" Apr 28 19:33:32.623223 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.623168 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="storage-initializer" Apr 28 19:33:32.623223 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.623211 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="agent" Apr 28 19:33:32.623223 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.623220 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="agent" Apr 28 19:33:32.623618 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.623292 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kube-rbac-proxy" Apr 28 19:33:32.623618 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.623308 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="agent" Apr 28 19:33:32.623618 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.623319 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0926c6f8-5cea-4436-a832-15b9b25b565c" containerName="kserve-container" Apr 28 19:33:32.626339 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.626315 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:32.628720 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.628699 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-b9e6a0-kube-rbac-proxy-sar-config\"" Apr 28 19:33:32.628911 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.628887 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-b9e6a0-predictor-serving-cert\"" Apr 28 19:33:32.635904 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.635880 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n"] Apr 28 19:33:32.737051 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.737010 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-b9e6a0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-isvc-primary-b9e6a0-kube-rbac-proxy-sar-config\") pod \"isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n\" (UID: \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:32.737254 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.737112 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hft2\" (UniqueName: \"kubernetes.io/projected/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-kube-api-access-7hft2\") pod \"isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n\" (UID: \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:32.737254 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.737164 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-proxy-tls\") pod \"isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n\" (UID: \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:32.737254 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.737239 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-kserve-provision-location\") pod \"isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n\" (UID: \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:32.838508 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.838468 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hft2\" (UniqueName: \"kubernetes.io/projected/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-kube-api-access-7hft2\") pod \"isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n\" (UID: \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:32.838683 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.838519 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-proxy-tls\") pod \"isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n\" (UID: \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:32.838781 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.838757 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-kserve-provision-location\") pod 
\"isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n\" (UID: \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:32.838846 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.838828 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-b9e6a0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-isvc-primary-b9e6a0-kube-rbac-proxy-sar-config\") pod \"isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n\" (UID: \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:32.839188 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.839153 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-kserve-provision-location\") pod \"isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n\" (UID: \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:32.839491 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.839473 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-b9e6a0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-isvc-primary-b9e6a0-kube-rbac-proxy-sar-config\") pod \"isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n\" (UID: \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:32.840874 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.840857 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-proxy-tls\") pod \"isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n\" (UID: 
\"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:32.847071 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.847041 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hft2\" (UniqueName: \"kubernetes.io/projected/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-kube-api-access-7hft2\") pod \"isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n\" (UID: \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:32.937281 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:32.937200 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:33.038695 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:33.038660 2572 generic.go:358] "Generic (PLEG): container finished" podID="9d14f466-716d-4651-92f3-f764554541ae" containerID="da3b8d135f5db06353346d44d1c134210facd6456b14afd1f8035a342e4c1806" exitCode=2 Apr 28 19:33:33.038847 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:33.038708 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" event={"ID":"9d14f466-716d-4651-92f3-f764554541ae","Type":"ContainerDied","Data":"da3b8d135f5db06353346d44d1c134210facd6456b14afd1f8035a342e4c1806"} Apr 28 19:33:33.057347 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:33.057326 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n"] Apr 28 19:33:33.059533 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:33:33.059507 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ef18fbd_984f_4923_a03a_bdf19cc4cd9a.slice/crio-424e45e1cbd33d0fef352425298214ec512f79203dac8f2868e57f6194f4f47b WatchSource:0}: Error 
finding container 424e45e1cbd33d0fef352425298214ec512f79203dac8f2868e57f6194f4f47b: Status 404 returned error can't find the container with id 424e45e1cbd33d0fef352425298214ec512f79203dac8f2868e57f6194f4f47b Apr 28 19:33:33.061267 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:33.061250 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:33:34.044655 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:34.044615 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" event={"ID":"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a","Type":"ContainerStarted","Data":"7cb024f00be62cf75f2768e7ce124babf55e2b5ce38a6f0e11fdb39914403ad2"} Apr 28 19:33:34.044655 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:34.044658 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" event={"ID":"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a","Type":"ContainerStarted","Data":"424e45e1cbd33d0fef352425298214ec512f79203dac8f2868e57f6194f4f47b"} Apr 28 19:33:34.688485 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:34.688431 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused" Apr 28 19:33:37.053978 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:37.053938 2572 generic.go:358] "Generic (PLEG): container finished" podID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerID="7cb024f00be62cf75f2768e7ce124babf55e2b5ce38a6f0e11fdb39914403ad2" exitCode=0 Apr 28 19:33:37.053978 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:37.053978 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" event={"ID":"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a","Type":"ContainerDied","Data":"7cb024f00be62cf75f2768e7ce124babf55e2b5ce38a6f0e11fdb39914403ad2"} Apr 28 19:33:37.886577 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:37.886532 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 28 19:33:38.058322 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:38.058286 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" event={"ID":"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a","Type":"ContainerStarted","Data":"c0fde74fe35d974120bf7639ee6e8b32f0841f2e30cc581f231be5b824fe7089"} Apr 28 19:33:38.058322 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:38.058326 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" event={"ID":"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a","Type":"ContainerStarted","Data":"defff2cabe3c75e275c28b51964eb8740010819e932a6dc887b965dd94baef6b"} Apr 28 19:33:38.058729 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:38.058537 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:38.079538 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:38.079488 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" podStartSLOduration=6.079473663 podStartE2EDuration="6.079473663s" podCreationTimestamp="2026-04-28 19:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-04-28 19:33:38.078337259 +0000 UTC m=+1059.750782401" watchObservedRunningTime="2026-04-28 19:33:38.079473663 +0000 UTC m=+1059.751918765" Apr 28 19:33:39.061413 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:39.061385 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:33:39.062484 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:39.062457 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 28 19:33:39.688952 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:39.688902 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused" Apr 28 19:33:40.063775 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:40.063731 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 28 19:33:41.943126 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:41.943103 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" Apr 28 19:33:42.012483 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.012447 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d14f466-716d-4651-92f3-f764554541ae-kserve-provision-location\") pod \"9d14f466-716d-4651-92f3-f764554541ae\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " Apr 28 19:33:42.012483 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.012490 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-scale-raw-9745c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9d14f466-716d-4651-92f3-f764554541ae-isvc-sklearn-scale-raw-9745c-kube-rbac-proxy-sar-config\") pod \"9d14f466-716d-4651-92f3-f764554541ae\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " Apr 28 19:33:42.012727 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.012511 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xpkr\" (UniqueName: \"kubernetes.io/projected/9d14f466-716d-4651-92f3-f764554541ae-kube-api-access-7xpkr\") pod \"9d14f466-716d-4651-92f3-f764554541ae\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " Apr 28 19:33:42.012727 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.012582 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d14f466-716d-4651-92f3-f764554541ae-proxy-tls\") pod \"9d14f466-716d-4651-92f3-f764554541ae\" (UID: \"9d14f466-716d-4651-92f3-f764554541ae\") " Apr 28 19:33:42.012871 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.012838 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d14f466-716d-4651-92f3-f764554541ae-isvc-sklearn-scale-raw-9745c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-sklearn-scale-raw-9745c-kube-rbac-proxy-sar-config") pod "9d14f466-716d-4651-92f3-f764554541ae" (UID: "9d14f466-716d-4651-92f3-f764554541ae"). InnerVolumeSpecName "isvc-sklearn-scale-raw-9745c-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:33:42.012928 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.012844 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d14f466-716d-4651-92f3-f764554541ae-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9d14f466-716d-4651-92f3-f764554541ae" (UID: "9d14f466-716d-4651-92f3-f764554541ae"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:33:42.014644 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.014618 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d14f466-716d-4651-92f3-f764554541ae-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9d14f466-716d-4651-92f3-f764554541ae" (UID: "9d14f466-716d-4651-92f3-f764554541ae"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:33:42.014723 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.014701 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d14f466-716d-4651-92f3-f764554541ae-kube-api-access-7xpkr" (OuterVolumeSpecName: "kube-api-access-7xpkr") pod "9d14f466-716d-4651-92f3-f764554541ae" (UID: "9d14f466-716d-4651-92f3-f764554541ae"). InnerVolumeSpecName "kube-api-access-7xpkr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:33:42.072005 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.071967 2572 generic.go:358] "Generic (PLEG): container finished" podID="9d14f466-716d-4651-92f3-f764554541ae" containerID="d0525abaad8ad6b8a30beb26b851ea17b4783bd9182a30869e16de72f2846870" exitCode=0
Apr 28 19:33:42.072208 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.072044 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" event={"ID":"9d14f466-716d-4651-92f3-f764554541ae","Type":"ContainerDied","Data":"d0525abaad8ad6b8a30beb26b851ea17b4783bd9182a30869e16de72f2846870"}
Apr 28 19:33:42.072208 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.072060 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"
Apr 28 19:33:42.072208 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.072083 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7" event={"ID":"9d14f466-716d-4651-92f3-f764554541ae","Type":"ContainerDied","Data":"90ddc9118f043b35f1f2f37afcf656bdf868d8a92c222d8ed7c940e8a7e03256"}
Apr 28 19:33:42.072208 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.072099 2572 scope.go:117] "RemoveContainer" containerID="da3b8d135f5db06353346d44d1c134210facd6456b14afd1f8035a342e4c1806"
Apr 28 19:33:42.079881 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.079861 2572 scope.go:117] "RemoveContainer" containerID="d0525abaad8ad6b8a30beb26b851ea17b4783bd9182a30869e16de72f2846870"
Apr 28 19:33:42.086971 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.086954 2572 scope.go:117] "RemoveContainer" containerID="2b5f027f74a18e554c40854642ecc4e26ed79345b586779a426104317fbf4580"
Apr 28 19:33:42.094319 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.094298 2572 scope.go:117] "RemoveContainer" containerID="da3b8d135f5db06353346d44d1c134210facd6456b14afd1f8035a342e4c1806"
Apr 28 19:33:42.094463 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.094439 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"]
Apr 28 19:33:42.094565 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:33:42.094545 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da3b8d135f5db06353346d44d1c134210facd6456b14afd1f8035a342e4c1806\": container with ID starting with da3b8d135f5db06353346d44d1c134210facd6456b14afd1f8035a342e4c1806 not found: ID does not exist" containerID="da3b8d135f5db06353346d44d1c134210facd6456b14afd1f8035a342e4c1806"
Apr 28 19:33:42.094614 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.094569 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da3b8d135f5db06353346d44d1c134210facd6456b14afd1f8035a342e4c1806"} err="failed to get container status \"da3b8d135f5db06353346d44d1c134210facd6456b14afd1f8035a342e4c1806\": rpc error: code = NotFound desc = could not find container \"da3b8d135f5db06353346d44d1c134210facd6456b14afd1f8035a342e4c1806\": container with ID starting with da3b8d135f5db06353346d44d1c134210facd6456b14afd1f8035a342e4c1806 not found: ID does not exist"
Apr 28 19:33:42.094614 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.094588 2572 scope.go:117] "RemoveContainer" containerID="d0525abaad8ad6b8a30beb26b851ea17b4783bd9182a30869e16de72f2846870"
Apr 28 19:33:42.094820 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:33:42.094802 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0525abaad8ad6b8a30beb26b851ea17b4783bd9182a30869e16de72f2846870\": container with ID starting with d0525abaad8ad6b8a30beb26b851ea17b4783bd9182a30869e16de72f2846870 not found: ID does not exist" containerID="d0525abaad8ad6b8a30beb26b851ea17b4783bd9182a30869e16de72f2846870"
Apr 28 19:33:42.094860 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.094826 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0525abaad8ad6b8a30beb26b851ea17b4783bd9182a30869e16de72f2846870"} err="failed to get container status \"d0525abaad8ad6b8a30beb26b851ea17b4783bd9182a30869e16de72f2846870\": rpc error: code = NotFound desc = could not find container \"d0525abaad8ad6b8a30beb26b851ea17b4783bd9182a30869e16de72f2846870\": container with ID starting with d0525abaad8ad6b8a30beb26b851ea17b4783bd9182a30869e16de72f2846870 not found: ID does not exist"
Apr 28 19:33:42.094860 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.094842 2572 scope.go:117] "RemoveContainer" containerID="2b5f027f74a18e554c40854642ecc4e26ed79345b586779a426104317fbf4580"
Apr 28 19:33:42.095071 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:33:42.095055 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b5f027f74a18e554c40854642ecc4e26ed79345b586779a426104317fbf4580\": container with ID starting with 2b5f027f74a18e554c40854642ecc4e26ed79345b586779a426104317fbf4580 not found: ID does not exist" containerID="2b5f027f74a18e554c40854642ecc4e26ed79345b586779a426104317fbf4580"
Apr 28 19:33:42.095114 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.095077 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5f027f74a18e554c40854642ecc4e26ed79345b586779a426104317fbf4580"} err="failed to get container status \"2b5f027f74a18e554c40854642ecc4e26ed79345b586779a426104317fbf4580\": rpc error: code = NotFound desc = could not find container \"2b5f027f74a18e554c40854642ecc4e26ed79345b586779a426104317fbf4580\": container with ID starting with 2b5f027f74a18e554c40854642ecc4e26ed79345b586779a426104317fbf4580 not found: ID does not exist"
Apr 28 19:33:42.098250 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.098230 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-9745c-predictor-74db4886f-jb4q7"]
Apr 28 19:33:42.113915 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.113890 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d14f466-716d-4651-92f3-f764554541ae-proxy-tls\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\""
Apr 28 19:33:42.113915 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.113913 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d14f466-716d-4651-92f3-f764554541ae-kserve-provision-location\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\""
Apr 28 19:33:42.114042 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.113923 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-scale-raw-9745c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9d14f466-716d-4651-92f3-f764554541ae-isvc-sklearn-scale-raw-9745c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\""
Apr 28 19:33:42.114042 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.113933 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7xpkr\" (UniqueName: \"kubernetes.io/projected/9d14f466-716d-4651-92f3-f764554541ae-kube-api-access-7xpkr\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\""
Apr 28 19:33:42.889045 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:42.889012 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d14f466-716d-4651-92f3-f764554541ae" path="/var/lib/kubelet/pods/9d14f466-716d-4651-92f3-f764554541ae/volumes"
Apr 28 19:33:45.067955 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:45.067919 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n"
Apr 28 19:33:45.068461 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:45.068424 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 28 19:33:55.068434 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:33:55.068392 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 28 19:34:05.069234 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:05.069191 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 28 19:34:15.068957 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:15.068901 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 28 19:34:25.068831 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:25.068791 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 28 19:34:35.069191 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:35.069133 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 28 19:34:45.069125 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:45.069094 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n"
Apr 28 19:34:51.959711 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.959676 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"]
Apr 28 19:34:51.960131 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.959926 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container"
Apr 28 19:34:51.960131 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.959937 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container"
Apr 28 19:34:51.960131 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.959953 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kube-rbac-proxy"
Apr 28 19:34:51.960131 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.959959 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kube-rbac-proxy"
Apr 28 19:34:51.960131 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.959968 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="storage-initializer"
Apr 28 19:34:51.960131 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.959973 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="storage-initializer"
Apr 28 19:34:51.960131 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.960014 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kube-rbac-proxy"
Apr 28 19:34:51.960131 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.960022 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d14f466-716d-4651-92f3-f764554541ae" containerName="kserve-container"
Apr 28 19:34:51.962924 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.962904 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:51.965528 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.965507 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-b9e6a0-predictor-serving-cert\""
Apr 28 19:34:51.965638 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.965581 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 28 19:34:51.965704 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.965511 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-b9e6a0-dockercfg-qw6q8\""
Apr 28 19:34:51.965931 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.965916 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-b9e6a0\""
Apr 28 19:34:51.966507 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.966489 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-b9e6a0-kube-rbac-proxy-sar-config\""
Apr 28 19:34:51.974458 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:51.974440 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"]
Apr 28 19:34:52.044397 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.044366 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a43f12ab-b64a-4dee-aea6-defd4e555c99-proxy-tls\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.044569 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.044407 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-b9e6a0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a43f12ab-b64a-4dee-aea6-defd4e555c99-isvc-secondary-b9e6a0-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.044569 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.044467 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a43f12ab-b64a-4dee-aea6-defd4e555c99-cabundle-cert\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.044569 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.044530 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75n9r\" (UniqueName: \"kubernetes.io/projected/a43f12ab-b64a-4dee-aea6-defd4e555c99-kube-api-access-75n9r\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.044693 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.044606 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a43f12ab-b64a-4dee-aea6-defd4e555c99-kserve-provision-location\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.145211 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.145164 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a43f12ab-b64a-4dee-aea6-defd4e555c99-proxy-tls\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.145371 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.145217 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-b9e6a0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a43f12ab-b64a-4dee-aea6-defd4e555c99-isvc-secondary-b9e6a0-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.145371 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.145237 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a43f12ab-b64a-4dee-aea6-defd4e555c99-cabundle-cert\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.145371 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:34:52.145334 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-serving-cert: secret "isvc-secondary-b9e6a0-predictor-serving-cert" not found
Apr 28 19:34:52.145524 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:34:52.145421 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a43f12ab-b64a-4dee-aea6-defd4e555c99-proxy-tls podName:a43f12ab-b64a-4dee-aea6-defd4e555c99 nodeName:}" failed. No retries permitted until 2026-04-28 19:34:52.645397945 +0000 UTC m=+1134.317843045 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a43f12ab-b64a-4dee-aea6-defd4e555c99-proxy-tls") pod "isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg" (UID: "a43f12ab-b64a-4dee-aea6-defd4e555c99") : secret "isvc-secondary-b9e6a0-predictor-serving-cert" not found
Apr 28 19:34:52.145524 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.145341 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75n9r\" (UniqueName: \"kubernetes.io/projected/a43f12ab-b64a-4dee-aea6-defd4e555c99-kube-api-access-75n9r\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.145628 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.145525 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a43f12ab-b64a-4dee-aea6-defd4e555c99-kserve-provision-location\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.145870 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.145854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a43f12ab-b64a-4dee-aea6-defd4e555c99-cabundle-cert\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.145904 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.145870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a43f12ab-b64a-4dee-aea6-defd4e555c99-kserve-provision-location\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.145946 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.145929 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-b9e6a0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a43f12ab-b64a-4dee-aea6-defd4e555c99-isvc-secondary-b9e6a0-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.153832 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.153801 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75n9r\" (UniqueName: \"kubernetes.io/projected/a43f12ab-b64a-4dee-aea6-defd4e555c99-kube-api-access-75n9r\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.649806 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.649757 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a43f12ab-b64a-4dee-aea6-defd4e555c99-proxy-tls\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.652164 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.652144 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a43f12ab-b64a-4dee-aea6-defd4e555c99-proxy-tls\") pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") " pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.873566 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.873528 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:34:52.993220 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:52.993193 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"]
Apr 28 19:34:52.995162 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:34:52.995138 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda43f12ab_b64a_4dee_aea6_defd4e555c99.slice/crio-097f3eae44ae08ddd6fda6c6c49adce1699711a38a4ec276ab7554a15633b559 WatchSource:0}: Error finding container 097f3eae44ae08ddd6fda6c6c49adce1699711a38a4ec276ab7554a15633b559: Status 404 returned error can't find the container with id 097f3eae44ae08ddd6fda6c6c49adce1699711a38a4ec276ab7554a15633b559
Apr 28 19:34:53.269676 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:53.269632 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg" event={"ID":"a43f12ab-b64a-4dee-aea6-defd4e555c99","Type":"ContainerStarted","Data":"2e6367c8fc697f6e68c6769f8959efa81488339871de48e26a1a801e39e68973"}
Apr 28 19:34:53.269676 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:53.269683 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg" event={"ID":"a43f12ab-b64a-4dee-aea6-defd4e555c99","Type":"ContainerStarted","Data":"097f3eae44ae08ddd6fda6c6c49adce1699711a38a4ec276ab7554a15633b559"}
Apr 28 19:34:57.280791 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:57.280757 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg_a43f12ab-b64a-4dee-aea6-defd4e555c99/storage-initializer/0.log"
Apr 28 19:34:57.281244 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:57.280801 2572 generic.go:358] "Generic (PLEG): container finished" podID="a43f12ab-b64a-4dee-aea6-defd4e555c99" containerID="2e6367c8fc697f6e68c6769f8959efa81488339871de48e26a1a801e39e68973" exitCode=1
Apr 28 19:34:57.281244 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:57.280879 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg" event={"ID":"a43f12ab-b64a-4dee-aea6-defd4e555c99","Type":"ContainerDied","Data":"2e6367c8fc697f6e68c6769f8959efa81488339871de48e26a1a801e39e68973"}
Apr 28 19:34:58.285237 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:58.285209 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg_a43f12ab-b64a-4dee-aea6-defd4e555c99/storage-initializer/0.log"
Apr 28 19:34:58.285603 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:34:58.285299 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg" event={"ID":"a43f12ab-b64a-4dee-aea6-defd4e555c99","Type":"ContainerStarted","Data":"87aff304e0953eb0b38b7d3fdba26b70906c95f8c5a7db8ce4b3b18b9ae2c965"}
Apr 28 19:35:02.297351 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:02.297322 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg_a43f12ab-b64a-4dee-aea6-defd4e555c99/storage-initializer/1.log"
Apr 28 19:35:02.297778 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:02.297678 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg_a43f12ab-b64a-4dee-aea6-defd4e555c99/storage-initializer/0.log"
Apr 28 19:35:02.297778 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:02.297707 2572 generic.go:358] "Generic (PLEG): container finished" podID="a43f12ab-b64a-4dee-aea6-defd4e555c99" containerID="87aff304e0953eb0b38b7d3fdba26b70906c95f8c5a7db8ce4b3b18b9ae2c965" exitCode=1
Apr 28 19:35:02.297778 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:02.297765 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg" event={"ID":"a43f12ab-b64a-4dee-aea6-defd4e555c99","Type":"ContainerDied","Data":"87aff304e0953eb0b38b7d3fdba26b70906c95f8c5a7db8ce4b3b18b9ae2c965"}
Apr 28 19:35:02.297931 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:02.297796 2572 scope.go:117] "RemoveContainer" containerID="2e6367c8fc697f6e68c6769f8959efa81488339871de48e26a1a801e39e68973"
Apr 28 19:35:02.298222 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:02.298204 2572 scope.go:117] "RemoveContainer" containerID="2e6367c8fc697f6e68c6769f8959efa81488339871de48e26a1a801e39e68973"
Apr 28 19:35:02.308107 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:35:02.308074 2572 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg_kserve-ci-e2e-test_a43f12ab-b64a-4dee-aea6-defd4e555c99_0 in pod sandbox 097f3eae44ae08ddd6fda6c6c49adce1699711a38a4ec276ab7554a15633b559 from index: no such id: '2e6367c8fc697f6e68c6769f8959efa81488339871de48e26a1a801e39e68973'" containerID="2e6367c8fc697f6e68c6769f8959efa81488339871de48e26a1a801e39e68973"
Apr 28 19:35:02.308189 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:35:02.308133 2572 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg_kserve-ci-e2e-test_a43f12ab-b64a-4dee-aea6-defd4e555c99_0 in pod sandbox 097f3eae44ae08ddd6fda6c6c49adce1699711a38a4ec276ab7554a15633b559 from index: no such id: '2e6367c8fc697f6e68c6769f8959efa81488339871de48e26a1a801e39e68973'; Skipping pod \"isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg_kserve-ci-e2e-test(a43f12ab-b64a-4dee-aea6-defd4e555c99)\"" logger="UnhandledError"
Apr 28 19:35:02.309417 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:35:02.309398 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg_kserve-ci-e2e-test(a43f12ab-b64a-4dee-aea6-defd4e555c99)\"" pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg" podUID="a43f12ab-b64a-4dee-aea6-defd4e555c99"
Apr 28 19:35:03.301900 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:03.301874 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg_a43f12ab-b64a-4dee-aea6-defd4e555c99/storage-initializer/1.log"
Apr 28 19:35:08.030656 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.030620 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"]
Apr 28 19:35:08.081236 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.081196 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n"]
Apr 28 19:35:08.081771 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.081737 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kserve-container" containerID="cri-o://defff2cabe3c75e275c28b51964eb8740010819e932a6dc887b965dd94baef6b" gracePeriod=30
Apr 28 19:35:08.081918 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.081785 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kube-rbac-proxy" containerID="cri-o://c0fde74fe35d974120bf7639ee6e8b32f0841f2e30cc581f231be5b824fe7089" gracePeriod=30
Apr 28 19:35:08.143448 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.143418 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7"]
Apr 28 19:35:08.148164 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.148142 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7"
Apr 28 19:35:08.150933 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.150910 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-687035-predictor-serving-cert\""
Apr 28 19:35:08.150933 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.150930 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-687035-kube-rbac-proxy-sar-config\""
Apr 28 19:35:08.151132 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.151016 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-687035\""
Apr 28 19:35:08.151132 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.151050 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-687035-dockercfg-b9vxb\""
Apr 28 19:35:08.158715 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.158690 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7"]
Apr 28 19:35:08.199612 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.199592 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg_a43f12ab-b64a-4dee-aea6-defd4e555c99/storage-initializer/1.log"
Apr 28 19:35:08.199704 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.199652 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"
Apr 28 19:35:08.277399 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.277358 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a43f12ab-b64a-4dee-aea6-defd4e555c99-kserve-provision-location\") pod \"a43f12ab-b64a-4dee-aea6-defd4e555c99\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") "
Apr 28 19:35:08.277575 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.277409 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-b9e6a0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a43f12ab-b64a-4dee-aea6-defd4e555c99-isvc-secondary-b9e6a0-kube-rbac-proxy-sar-config\") pod \"a43f12ab-b64a-4dee-aea6-defd4e555c99\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") "
Apr 28 19:35:08.277575 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.277436 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a43f12ab-b64a-4dee-aea6-defd4e555c99-cabundle-cert\") pod \"a43f12ab-b64a-4dee-aea6-defd4e555c99\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") "
Apr 28 19:35:08.277575 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.277460 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75n9r\" (UniqueName: \"kubernetes.io/projected/a43f12ab-b64a-4dee-aea6-defd4e555c99-kube-api-access-75n9r\") pod \"a43f12ab-b64a-4dee-aea6-defd4e555c99\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") "
Apr 28 19:35:08.277575 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.277509 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a43f12ab-b64a-4dee-aea6-defd4e555c99-proxy-tls\") pod \"a43f12ab-b64a-4dee-aea6-defd4e555c99\" (UID: \"a43f12ab-b64a-4dee-aea6-defd4e555c99\") "
Apr 28 19:35:08.277759 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.277716 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-687035-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df050d88-56c8-49d8-b981-4aaf0edfc7cf-isvc-init-fail-687035-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7"
Apr 28 19:35:08.277759 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.277703 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a43f12ab-b64a-4dee-aea6-defd4e555c99-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a43f12ab-b64a-4dee-aea6-defd4e555c99" (UID: "a43f12ab-b64a-4dee-aea6-defd4e555c99"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:35:08.277759 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.277754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/df050d88-56c8-49d8-b981-4aaf0edfc7cf-cabundle-cert\") pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7"
Apr 28 19:35:08.277916 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.277785 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df050d88-56c8-49d8-b981-4aaf0edfc7cf-proxy-tls\") pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7"
Apr 28 19:35:08.277916 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.277824 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm6jt\" (UniqueName: \"kubernetes.io/projected/df050d88-56c8-49d8-b981-4aaf0edfc7cf-kube-api-access-tm6jt\") pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7"
Apr 28 19:35:08.277916 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.277854 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a43f12ab-b64a-4dee-aea6-defd4e555c99-isvc-secondary-b9e6a0-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-b9e6a0-kube-rbac-proxy-sar-config") pod "a43f12ab-b64a-4dee-aea6-defd4e555c99" (UID: "a43f12ab-b64a-4dee-aea6-defd4e555c99"). InnerVolumeSpecName "isvc-secondary-b9e6a0-kube-rbac-proxy-sar-config".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:35:08.277916 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.277878 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df050d88-56c8-49d8-b981-4aaf0edfc7cf-kserve-provision-location\") pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" Apr 28 19:35:08.277916 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.277881 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a43f12ab-b64a-4dee-aea6-defd4e555c99-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "a43f12ab-b64a-4dee-aea6-defd4e555c99" (UID: "a43f12ab-b64a-4dee-aea6-defd4e555c99"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:35:08.278102 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.277928 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a43f12ab-b64a-4dee-aea6-defd4e555c99-kserve-provision-location\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:35:08.278102 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.277939 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-b9e6a0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a43f12ab-b64a-4dee-aea6-defd4e555c99-isvc-secondary-b9e6a0-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:35:08.279715 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.279694 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a43f12ab-b64a-4dee-aea6-defd4e555c99-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod 
"a43f12ab-b64a-4dee-aea6-defd4e555c99" (UID: "a43f12ab-b64a-4dee-aea6-defd4e555c99"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:35:08.279791 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.279759 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a43f12ab-b64a-4dee-aea6-defd4e555c99-kube-api-access-75n9r" (OuterVolumeSpecName: "kube-api-access-75n9r") pod "a43f12ab-b64a-4dee-aea6-defd4e555c99" (UID: "a43f12ab-b64a-4dee-aea6-defd4e555c99"). InnerVolumeSpecName "kube-api-access-75n9r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:35:08.316825 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.316792 2572 generic.go:358] "Generic (PLEG): container finished" podID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerID="c0fde74fe35d974120bf7639ee6e8b32f0841f2e30cc581f231be5b824fe7089" exitCode=2 Apr 28 19:35:08.316989 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.316857 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" event={"ID":"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a","Type":"ContainerDied","Data":"c0fde74fe35d974120bf7639ee6e8b32f0841f2e30cc581f231be5b824fe7089"} Apr 28 19:35:08.317895 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.317882 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg_a43f12ab-b64a-4dee-aea6-defd4e555c99/storage-initializer/1.log" Apr 28 19:35:08.317986 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.317970 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg" event={"ID":"a43f12ab-b64a-4dee-aea6-defd4e555c99","Type":"ContainerDied","Data":"097f3eae44ae08ddd6fda6c6c49adce1699711a38a4ec276ab7554a15633b559"} Apr 28 19:35:08.318027 ip-10-0-134-36 kubenswrapper[2572]: I0428 
19:35:08.317997 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg" Apr 28 19:35:08.318027 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.318005 2572 scope.go:117] "RemoveContainer" containerID="87aff304e0953eb0b38b7d3fdba26b70906c95f8c5a7db8ce4b3b18b9ae2c965" Apr 28 19:35:08.355120 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.355089 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"] Apr 28 19:35:08.359031 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.359000 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b9e6a0-predictor-5d7d9858d7-lh2hg"] Apr 28 19:35:08.378547 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.378508 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-687035-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df050d88-56c8-49d8-b981-4aaf0edfc7cf-isvc-init-fail-687035-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" Apr 28 19:35:08.378708 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.378556 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/df050d88-56c8-49d8-b981-4aaf0edfc7cf-cabundle-cert\") pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" Apr 28 19:35:08.378708 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.378595 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/df050d88-56c8-49d8-b981-4aaf0edfc7cf-proxy-tls\") pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" Apr 28 19:35:08.378708 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.378646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tm6jt\" (UniqueName: \"kubernetes.io/projected/df050d88-56c8-49d8-b981-4aaf0edfc7cf-kube-api-access-tm6jt\") pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" Apr 28 19:35:08.378851 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:35:08.378745 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-687035-predictor-serving-cert: secret "isvc-init-fail-687035-predictor-serving-cert" not found Apr 28 19:35:08.378851 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:35:08.378804 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df050d88-56c8-49d8-b981-4aaf0edfc7cf-proxy-tls podName:df050d88-56c8-49d8-b981-4aaf0edfc7cf nodeName:}" failed. No retries permitted until 2026-04-28 19:35:08.878784363 +0000 UTC m=+1150.551229448 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/df050d88-56c8-49d8-b981-4aaf0edfc7cf-proxy-tls") pod "isvc-init-fail-687035-predictor-857cbdf89c-qnws7" (UID: "df050d88-56c8-49d8-b981-4aaf0edfc7cf") : secret "isvc-init-fail-687035-predictor-serving-cert" not found Apr 28 19:35:08.378851 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.378791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df050d88-56c8-49d8-b981-4aaf0edfc7cf-kserve-provision-location\") pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" Apr 28 19:35:08.378994 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.378908 2572 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a43f12ab-b64a-4dee-aea6-defd4e555c99-cabundle-cert\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:35:08.378994 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.378928 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-75n9r\" (UniqueName: \"kubernetes.io/projected/a43f12ab-b64a-4dee-aea6-defd4e555c99-kube-api-access-75n9r\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:35:08.378994 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.378943 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a43f12ab-b64a-4dee-aea6-defd4e555c99-proxy-tls\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:35:08.379259 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.379237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/df050d88-56c8-49d8-b981-4aaf0edfc7cf-kserve-provision-location\") pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" Apr 28 19:35:08.379296 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.379247 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/df050d88-56c8-49d8-b981-4aaf0edfc7cf-cabundle-cert\") pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" Apr 28 19:35:08.379333 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.379288 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-687035-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df050d88-56c8-49d8-b981-4aaf0edfc7cf-isvc-init-fail-687035-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" Apr 28 19:35:08.389131 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.389107 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm6jt\" (UniqueName: \"kubernetes.io/projected/df050d88-56c8-49d8-b981-4aaf0edfc7cf-kube-api-access-tm6jt\") pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" Apr 28 19:35:08.882428 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.882386 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df050d88-56c8-49d8-b981-4aaf0edfc7cf-proxy-tls\") pod 
\"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" Apr 28 19:35:08.884887 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.884852 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df050d88-56c8-49d8-b981-4aaf0edfc7cf-proxy-tls\") pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" Apr 28 19:35:08.888400 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:08.888376 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a43f12ab-b64a-4dee-aea6-defd4e555c99" path="/var/lib/kubelet/pods/a43f12ab-b64a-4dee-aea6-defd4e555c99/volumes" Apr 28 19:35:09.063217 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:09.063157 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" Apr 28 19:35:09.183165 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:09.183142 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7"] Apr 28 19:35:09.185167 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:35:09.185138 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf050d88_56c8_49d8_b981_4aaf0edfc7cf.slice/crio-673de0c2bd69de6fbfb9765d667c8756a3f179a1385776b6bebbf06c8c49f512 WatchSource:0}: Error finding container 673de0c2bd69de6fbfb9765d667c8756a3f179a1385776b6bebbf06c8c49f512: Status 404 returned error can't find the container with id 673de0c2bd69de6fbfb9765d667c8756a3f179a1385776b6bebbf06c8c49f512 Apr 28 19:35:09.323140 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:09.323101 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" event={"ID":"df050d88-56c8-49d8-b981-4aaf0edfc7cf","Type":"ContainerStarted","Data":"9a58e6a764a6c10635822611a4551dee30fc07943c566cfb2f877ac848600dfd"} Apr 28 19:35:09.323140 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:09.323145 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" event={"ID":"df050d88-56c8-49d8-b981-4aaf0edfc7cf","Type":"ContainerStarted","Data":"673de0c2bd69de6fbfb9765d667c8756a3f179a1385776b6bebbf06c8c49f512"} Apr 28 19:35:10.064268 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:10.064223 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection 
refused" Apr 28 19:35:12.320865 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.320839 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:35:12.332565 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.332533 2572 generic.go:358] "Generic (PLEG): container finished" podID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerID="defff2cabe3c75e275c28b51964eb8740010819e932a6dc887b965dd94baef6b" exitCode=0 Apr 28 19:35:12.332742 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.332579 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" event={"ID":"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a","Type":"ContainerDied","Data":"defff2cabe3c75e275c28b51964eb8740010819e932a6dc887b965dd94baef6b"} Apr 28 19:35:12.332742 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.332605 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" Apr 28 19:35:12.332742 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.332613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n" event={"ID":"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a","Type":"ContainerDied","Data":"424e45e1cbd33d0fef352425298214ec512f79203dac8f2868e57f6194f4f47b"} Apr 28 19:35:12.332742 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.332632 2572 scope.go:117] "RemoveContainer" containerID="c0fde74fe35d974120bf7639ee6e8b32f0841f2e30cc581f231be5b824fe7089" Apr 28 19:35:12.339847 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.339826 2572 scope.go:117] "RemoveContainer" containerID="defff2cabe3c75e275c28b51964eb8740010819e932a6dc887b965dd94baef6b" Apr 28 19:35:12.348454 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.348433 2572 scope.go:117] "RemoveContainer" containerID="7cb024f00be62cf75f2768e7ce124babf55e2b5ce38a6f0e11fdb39914403ad2" Apr 28 19:35:12.356506 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.356484 2572 scope.go:117] "RemoveContainer" containerID="c0fde74fe35d974120bf7639ee6e8b32f0841f2e30cc581f231be5b824fe7089" Apr 28 19:35:12.356763 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:35:12.356745 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0fde74fe35d974120bf7639ee6e8b32f0841f2e30cc581f231be5b824fe7089\": container with ID starting with c0fde74fe35d974120bf7639ee6e8b32f0841f2e30cc581f231be5b824fe7089 not found: ID does not exist" containerID="c0fde74fe35d974120bf7639ee6e8b32f0841f2e30cc581f231be5b824fe7089" Apr 28 19:35:12.356819 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.356773 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0fde74fe35d974120bf7639ee6e8b32f0841f2e30cc581f231be5b824fe7089"} err="failed to get 
container status \"c0fde74fe35d974120bf7639ee6e8b32f0841f2e30cc581f231be5b824fe7089\": rpc error: code = NotFound desc = could not find container \"c0fde74fe35d974120bf7639ee6e8b32f0841f2e30cc581f231be5b824fe7089\": container with ID starting with c0fde74fe35d974120bf7639ee6e8b32f0841f2e30cc581f231be5b824fe7089 not found: ID does not exist" Apr 28 19:35:12.356819 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.356792 2572 scope.go:117] "RemoveContainer" containerID="defff2cabe3c75e275c28b51964eb8740010819e932a6dc887b965dd94baef6b" Apr 28 19:35:12.357042 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:35:12.357022 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"defff2cabe3c75e275c28b51964eb8740010819e932a6dc887b965dd94baef6b\": container with ID starting with defff2cabe3c75e275c28b51964eb8740010819e932a6dc887b965dd94baef6b not found: ID does not exist" containerID="defff2cabe3c75e275c28b51964eb8740010819e932a6dc887b965dd94baef6b" Apr 28 19:35:12.357109 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.357052 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"defff2cabe3c75e275c28b51964eb8740010819e932a6dc887b965dd94baef6b"} err="failed to get container status \"defff2cabe3c75e275c28b51964eb8740010819e932a6dc887b965dd94baef6b\": rpc error: code = NotFound desc = could not find container \"defff2cabe3c75e275c28b51964eb8740010819e932a6dc887b965dd94baef6b\": container with ID starting with defff2cabe3c75e275c28b51964eb8740010819e932a6dc887b965dd94baef6b not found: ID does not exist" Apr 28 19:35:12.357109 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.357074 2572 scope.go:117] "RemoveContainer" containerID="7cb024f00be62cf75f2768e7ce124babf55e2b5ce38a6f0e11fdb39914403ad2" Apr 28 19:35:12.357304 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:35:12.357286 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"7cb024f00be62cf75f2768e7ce124babf55e2b5ce38a6f0e11fdb39914403ad2\": container with ID starting with 7cb024f00be62cf75f2768e7ce124babf55e2b5ce38a6f0e11fdb39914403ad2 not found: ID does not exist" containerID="7cb024f00be62cf75f2768e7ce124babf55e2b5ce38a6f0e11fdb39914403ad2" Apr 28 19:35:12.357348 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.357307 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cb024f00be62cf75f2768e7ce124babf55e2b5ce38a6f0e11fdb39914403ad2"} err="failed to get container status \"7cb024f00be62cf75f2768e7ce124babf55e2b5ce38a6f0e11fdb39914403ad2\": rpc error: code = NotFound desc = could not find container \"7cb024f00be62cf75f2768e7ce124babf55e2b5ce38a6f0e11fdb39914403ad2\": container with ID starting with 7cb024f00be62cf75f2768e7ce124babf55e2b5ce38a6f0e11fdb39914403ad2 not found: ID does not exist" Apr 28 19:35:12.413682 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.413653 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-proxy-tls\") pod \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\" (UID: \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " Apr 28 19:35:12.413852 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.413720 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hft2\" (UniqueName: \"kubernetes.io/projected/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-kube-api-access-7hft2\") pod \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\" (UID: \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " Apr 28 19:35:12.413852 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.413750 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-b9e6a0-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-isvc-primary-b9e6a0-kube-rbac-proxy-sar-config\") pod \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\" (UID: \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " Apr 28 19:35:12.413957 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.413849 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-kserve-provision-location\") pod \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\" (UID: \"1ef18fbd-984f-4923-a03a-bdf19cc4cd9a\") " Apr 28 19:35:12.414147 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.414123 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-isvc-primary-b9e6a0-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-b9e6a0-kube-rbac-proxy-sar-config") pod "1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" (UID: "1ef18fbd-984f-4923-a03a-bdf19cc4cd9a"). InnerVolumeSpecName "isvc-primary-b9e6a0-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:35:12.414241 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.414139 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" (UID: "1ef18fbd-984f-4923-a03a-bdf19cc4cd9a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:35:12.415770 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.415747 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" (UID: "1ef18fbd-984f-4923-a03a-bdf19cc4cd9a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:35:12.415936 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.415912 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-kube-api-access-7hft2" (OuterVolumeSpecName: "kube-api-access-7hft2") pod "1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" (UID: "1ef18fbd-984f-4923-a03a-bdf19cc4cd9a"). InnerVolumeSpecName "kube-api-access-7hft2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:35:12.514743 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.514694 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-kserve-provision-location\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:35:12.514743 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.514743 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-proxy-tls\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:35:12.514743 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.514756 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7hft2\" (UniqueName: \"kubernetes.io/projected/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-kube-api-access-7hft2\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:35:12.514975 ip-10-0-134-36 kubenswrapper[2572]: I0428 
19:35:12.514765 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-b9e6a0-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a-isvc-primary-b9e6a0-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:35:12.655551 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.655518 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n"] Apr 28 19:35:12.663791 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.663759 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b9e6a0-predictor-5bdbbd7cbd-68l4n"] Apr 28 19:35:12.888910 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:12.888835 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" path="/var/lib/kubelet/pods/1ef18fbd-984f-4923-a03a-bdf19cc4cd9a/volumes" Apr 28 19:35:13.336981 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:13.336943 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-687035-predictor-857cbdf89c-qnws7_df050d88-56c8-49d8-b981-4aaf0edfc7cf/storage-initializer/0.log" Apr 28 19:35:13.337369 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:13.336991 2572 generic.go:358] "Generic (PLEG): container finished" podID="df050d88-56c8-49d8-b981-4aaf0edfc7cf" containerID="9a58e6a764a6c10635822611a4551dee30fc07943c566cfb2f877ac848600dfd" exitCode=1 Apr 28 19:35:13.337369 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:13.337037 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" event={"ID":"df050d88-56c8-49d8-b981-4aaf0edfc7cf","Type":"ContainerDied","Data":"9a58e6a764a6c10635822611a4551dee30fc07943c566cfb2f877ac848600dfd"} Apr 28 19:35:14.341593 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:14.341565 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-687035-predictor-857cbdf89c-qnws7_df050d88-56c8-49d8-b981-4aaf0edfc7cf/storage-initializer/0.log" Apr 28 19:35:14.341978 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:14.341666 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" event={"ID":"df050d88-56c8-49d8-b981-4aaf0edfc7cf","Type":"ContainerStarted","Data":"9024c98bb9a4d3002db6d4637aa18608cd1e879819eb83bc193adbe289db101e"} Apr 28 19:35:17.351466 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:17.351437 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-687035-predictor-857cbdf89c-qnws7_df050d88-56c8-49d8-b981-4aaf0edfc7cf/storage-initializer/1.log" Apr 28 19:35:17.351874 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:17.351796 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-687035-predictor-857cbdf89c-qnws7_df050d88-56c8-49d8-b981-4aaf0edfc7cf/storage-initializer/0.log" Apr 28 19:35:17.351874 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:17.351827 2572 generic.go:358] "Generic (PLEG): container finished" podID="df050d88-56c8-49d8-b981-4aaf0edfc7cf" containerID="9024c98bb9a4d3002db6d4637aa18608cd1e879819eb83bc193adbe289db101e" exitCode=1 Apr 28 19:35:17.351964 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:17.351897 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" event={"ID":"df050d88-56c8-49d8-b981-4aaf0edfc7cf","Type":"ContainerDied","Data":"9024c98bb9a4d3002db6d4637aa18608cd1e879819eb83bc193adbe289db101e"} Apr 28 19:35:17.351964 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:17.351935 2572 scope.go:117] "RemoveContainer" containerID="9a58e6a764a6c10635822611a4551dee30fc07943c566cfb2f877ac848600dfd" Apr 28 19:35:17.352314 ip-10-0-134-36 kubenswrapper[2572]: 
I0428 19:35:17.352297 2572 scope.go:117] "RemoveContainer" containerID="9a58e6a764a6c10635822611a4551dee30fc07943c566cfb2f877ac848600dfd" Apr 28 19:35:17.361947 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:35:17.361914 2572 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-687035-predictor-857cbdf89c-qnws7_kserve-ci-e2e-test_df050d88-56c8-49d8-b981-4aaf0edfc7cf_0 in pod sandbox 673de0c2bd69de6fbfb9765d667c8756a3f179a1385776b6bebbf06c8c49f512 from index: no such id: '9a58e6a764a6c10635822611a4551dee30fc07943c566cfb2f877ac848600dfd'" containerID="9a58e6a764a6c10635822611a4551dee30fc07943c566cfb2f877ac848600dfd" Apr 28 19:35:17.362052 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:35:17.361964 2572 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-687035-predictor-857cbdf89c-qnws7_kserve-ci-e2e-test_df050d88-56c8-49d8-b981-4aaf0edfc7cf_0 in pod sandbox 673de0c2bd69de6fbfb9765d667c8756a3f179a1385776b6bebbf06c8c49f512 from index: no such id: '9a58e6a764a6c10635822611a4551dee30fc07943c566cfb2f877ac848600dfd'; Skipping pod \"isvc-init-fail-687035-predictor-857cbdf89c-qnws7_kserve-ci-e2e-test(df050d88-56c8-49d8-b981-4aaf0edfc7cf)\"" logger="UnhandledError" Apr 28 19:35:17.363304 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:35:17.363286 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-687035-predictor-857cbdf89c-qnws7_kserve-ci-e2e-test(df050d88-56c8-49d8-b981-4aaf0edfc7cf)\"" pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" podUID="df050d88-56c8-49d8-b981-4aaf0edfc7cf" Apr 28 19:35:18.148773 ip-10-0-134-36 
kubenswrapper[2572]: I0428 19:35:18.148741 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7"] Apr 28 19:35:18.254091 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.254043 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt"] Apr 28 19:35:18.254359 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.254345 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="storage-initializer" Apr 28 19:35:18.254359 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.254360 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="storage-initializer" Apr 28 19:35:18.254463 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.254373 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a43f12ab-b64a-4dee-aea6-defd4e555c99" containerName="storage-initializer" Apr 28 19:35:18.254463 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.254379 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a43f12ab-b64a-4dee-aea6-defd4e555c99" containerName="storage-initializer" Apr 28 19:35:18.254463 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.254388 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a43f12ab-b64a-4dee-aea6-defd4e555c99" containerName="storage-initializer" Apr 28 19:35:18.254463 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.254393 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a43f12ab-b64a-4dee-aea6-defd4e555c99" containerName="storage-initializer" Apr 28 19:35:18.254463 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.254399 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kube-rbac-proxy" Apr 28 
19:35:18.254463 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.254403 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kube-rbac-proxy" Apr 28 19:35:18.254463 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.254412 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kserve-container" Apr 28 19:35:18.254463 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.254417 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kserve-container" Apr 28 19:35:18.254463 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.254456 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a43f12ab-b64a-4dee-aea6-defd4e555c99" containerName="storage-initializer" Apr 28 19:35:18.254463 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.254462 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kube-rbac-proxy" Apr 28 19:35:18.254727 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.254471 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ef18fbd-984f-4923-a03a-bdf19cc4cd9a" containerName="kserve-container" Apr 28 19:35:18.254727 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.254479 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a43f12ab-b64a-4dee-aea6-defd4e555c99" containerName="storage-initializer" Apr 28 19:35:18.258595 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.258574 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:18.261405 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.261384 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5h9ft\"" Apr 28 19:35:18.261511 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.261423 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-3e269-kube-rbac-proxy-sar-config\"" Apr 28 19:35:18.261511 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.261437 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-3e269-predictor-serving-cert\"" Apr 28 19:35:18.264735 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.264712 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt"] Apr 28 19:35:18.356512 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.356485 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-687035-predictor-857cbdf89c-qnws7_df050d88-56c8-49d8-b981-4aaf0edfc7cf/storage-initializer/1.log" Apr 28 19:35:18.358231 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.358202 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a799eefe-a5f4-45cb-af20-50c943b0b9e4-kserve-provision-location\") pod \"raw-sklearn-3e269-predictor-75dc5c954c-zt7nt\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:18.358330 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.358241 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-3e269-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/a799eefe-a5f4-45cb-af20-50c943b0b9e4-raw-sklearn-3e269-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-3e269-predictor-75dc5c954c-zt7nt\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:18.358330 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.358270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k5c7\" (UniqueName: \"kubernetes.io/projected/a799eefe-a5f4-45cb-af20-50c943b0b9e4-kube-api-access-2k5c7\") pod \"raw-sklearn-3e269-predictor-75dc5c954c-zt7nt\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:18.358463 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.358355 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a799eefe-a5f4-45cb-af20-50c943b0b9e4-proxy-tls\") pod \"raw-sklearn-3e269-predictor-75dc5c954c-zt7nt\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:18.459742 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.459709 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a799eefe-a5f4-45cb-af20-50c943b0b9e4-proxy-tls\") pod \"raw-sklearn-3e269-predictor-75dc5c954c-zt7nt\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:18.459926 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.459756 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a799eefe-a5f4-45cb-af20-50c943b0b9e4-kserve-provision-location\") pod 
\"raw-sklearn-3e269-predictor-75dc5c954c-zt7nt\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:18.459926 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.459787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-3e269-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a799eefe-a5f4-45cb-af20-50c943b0b9e4-raw-sklearn-3e269-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-3e269-predictor-75dc5c954c-zt7nt\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:18.459926 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.459830 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k5c7\" (UniqueName: \"kubernetes.io/projected/a799eefe-a5f4-45cb-af20-50c943b0b9e4-kube-api-access-2k5c7\") pod \"raw-sklearn-3e269-predictor-75dc5c954c-zt7nt\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:18.459926 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:35:18.459886 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/raw-sklearn-3e269-predictor-serving-cert: secret "raw-sklearn-3e269-predictor-serving-cert" not found Apr 28 19:35:18.460200 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:35:18.459959 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a799eefe-a5f4-45cb-af20-50c943b0b9e4-proxy-tls podName:a799eefe-a5f4-45cb-af20-50c943b0b9e4 nodeName:}" failed. No retries permitted until 2026-04-28 19:35:18.95993436 +0000 UTC m=+1160.632379441 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a799eefe-a5f4-45cb-af20-50c943b0b9e4-proxy-tls") pod "raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" (UID: "a799eefe-a5f4-45cb-af20-50c943b0b9e4") : secret "raw-sklearn-3e269-predictor-serving-cert" not found Apr 28 19:35:18.460273 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.460210 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a799eefe-a5f4-45cb-af20-50c943b0b9e4-kserve-provision-location\") pod \"raw-sklearn-3e269-predictor-75dc5c954c-zt7nt\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:18.460710 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.460688 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-3e269-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a799eefe-a5f4-45cb-af20-50c943b0b9e4-raw-sklearn-3e269-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-3e269-predictor-75dc5c954c-zt7nt\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:18.469135 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.469113 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k5c7\" (UniqueName: \"kubernetes.io/projected/a799eefe-a5f4-45cb-af20-50c943b0b9e4-kube-api-access-2k5c7\") pod \"raw-sklearn-3e269-predictor-75dc5c954c-zt7nt\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:18.483057 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.483036 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-687035-predictor-857cbdf89c-qnws7_df050d88-56c8-49d8-b981-4aaf0edfc7cf/storage-initializer/1.log" Apr 28 
19:35:18.483167 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.483100 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" Apr 28 19:35:18.560617 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.560586 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm6jt\" (UniqueName: \"kubernetes.io/projected/df050d88-56c8-49d8-b981-4aaf0edfc7cf-kube-api-access-tm6jt\") pod \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " Apr 28 19:35:18.560787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.560678 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/df050d88-56c8-49d8-b981-4aaf0edfc7cf-cabundle-cert\") pod \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " Apr 28 19:35:18.560787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.560702 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df050d88-56c8-49d8-b981-4aaf0edfc7cf-proxy-tls\") pod \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " Apr 28 19:35:18.560787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.560723 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df050d88-56c8-49d8-b981-4aaf0edfc7cf-kserve-provision-location\") pod \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " Apr 28 19:35:18.560787 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.560751 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-687035-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/df050d88-56c8-49d8-b981-4aaf0edfc7cf-isvc-init-fail-687035-kube-rbac-proxy-sar-config\") pod \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\" (UID: \"df050d88-56c8-49d8-b981-4aaf0edfc7cf\") " Apr 28 19:35:18.561022 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.560974 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df050d88-56c8-49d8-b981-4aaf0edfc7cf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "df050d88-56c8-49d8-b981-4aaf0edfc7cf" (UID: "df050d88-56c8-49d8-b981-4aaf0edfc7cf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:35:18.561077 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.561040 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df050d88-56c8-49d8-b981-4aaf0edfc7cf-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "df050d88-56c8-49d8-b981-4aaf0edfc7cf" (UID: "df050d88-56c8-49d8-b981-4aaf0edfc7cf"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:35:18.561316 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.561287 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df050d88-56c8-49d8-b981-4aaf0edfc7cf-isvc-init-fail-687035-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-687035-kube-rbac-proxy-sar-config") pod "df050d88-56c8-49d8-b981-4aaf0edfc7cf" (UID: "df050d88-56c8-49d8-b981-4aaf0edfc7cf"). InnerVolumeSpecName "isvc-init-fail-687035-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:35:18.562834 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.562808 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df050d88-56c8-49d8-b981-4aaf0edfc7cf-kube-api-access-tm6jt" (OuterVolumeSpecName: "kube-api-access-tm6jt") pod "df050d88-56c8-49d8-b981-4aaf0edfc7cf" (UID: "df050d88-56c8-49d8-b981-4aaf0edfc7cf"). InnerVolumeSpecName "kube-api-access-tm6jt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:35:18.562834 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.562821 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df050d88-56c8-49d8-b981-4aaf0edfc7cf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "df050d88-56c8-49d8-b981-4aaf0edfc7cf" (UID: "df050d88-56c8-49d8-b981-4aaf0edfc7cf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:35:18.662388 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.662306 2572 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/df050d88-56c8-49d8-b981-4aaf0edfc7cf-cabundle-cert\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:35:18.662388 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.662334 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df050d88-56c8-49d8-b981-4aaf0edfc7cf-proxy-tls\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:35:18.662388 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.662345 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df050d88-56c8-49d8-b981-4aaf0edfc7cf-kserve-provision-location\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:35:18.662388 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.662362 2572 
reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-687035-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df050d88-56c8-49d8-b981-4aaf0edfc7cf-isvc-init-fail-687035-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:35:18.662388 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.662375 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tm6jt\" (UniqueName: \"kubernetes.io/projected/df050d88-56c8-49d8-b981-4aaf0edfc7cf-kube-api-access-tm6jt\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:35:18.964440 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.964408 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a799eefe-a5f4-45cb-af20-50c943b0b9e4-proxy-tls\") pod \"raw-sklearn-3e269-predictor-75dc5c954c-zt7nt\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:18.966773 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:18.966749 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a799eefe-a5f4-45cb-af20-50c943b0b9e4-proxy-tls\") pod \"raw-sklearn-3e269-predictor-75dc5c954c-zt7nt\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:19.169462 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:19.169423 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:19.286903 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:19.286862 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt"] Apr 28 19:35:19.290602 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:35:19.290576 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda799eefe_a5f4_45cb_af20_50c943b0b9e4.slice/crio-42c7e43368e0f8f268ed4668868f9fec88fbbac53bfd045c9ba2badbfb1b1bef WatchSource:0}: Error finding container 42c7e43368e0f8f268ed4668868f9fec88fbbac53bfd045c9ba2badbfb1b1bef: Status 404 returned error can't find the container with id 42c7e43368e0f8f268ed4668868f9fec88fbbac53bfd045c9ba2badbfb1b1bef Apr 28 19:35:19.360470 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:19.360445 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-687035-predictor-857cbdf89c-qnws7_df050d88-56c8-49d8-b981-4aaf0edfc7cf/storage-initializer/1.log" Apr 28 19:35:19.360890 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:19.360530 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" event={"ID":"df050d88-56c8-49d8-b981-4aaf0edfc7cf","Type":"ContainerDied","Data":"673de0c2bd69de6fbfb9765d667c8756a3f179a1385776b6bebbf06c8c49f512"} Apr 28 19:35:19.360890 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:19.360568 2572 scope.go:117] "RemoveContainer" containerID="9024c98bb9a4d3002db6d4637aa18608cd1e879819eb83bc193adbe289db101e" Apr 28 19:35:19.360890 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:19.360576 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7" Apr 28 19:35:19.362576 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:19.362047 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" event={"ID":"a799eefe-a5f4-45cb-af20-50c943b0b9e4","Type":"ContainerStarted","Data":"f27b9af8c79fe504944b2b37f7cf16c923401de9598b8d6b6f43a889a7f58dbe"} Apr 28 19:35:19.362576 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:19.362082 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" event={"ID":"a799eefe-a5f4-45cb-af20-50c943b0b9e4","Type":"ContainerStarted","Data":"42c7e43368e0f8f268ed4668868f9fec88fbbac53bfd045c9ba2badbfb1b1bef"} Apr 28 19:35:19.393666 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:19.393630 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7"] Apr 28 19:35:19.397236 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:19.397193 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-687035-predictor-857cbdf89c-qnws7"] Apr 28 19:35:20.888848 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:20.888815 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df050d88-56c8-49d8-b981-4aaf0edfc7cf" path="/var/lib/kubelet/pods/df050d88-56c8-49d8-b981-4aaf0edfc7cf/volumes" Apr 28 19:35:23.373899 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:23.373817 2572 generic.go:358] "Generic (PLEG): container finished" podID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerID="f27b9af8c79fe504944b2b37f7cf16c923401de9598b8d6b6f43a889a7f58dbe" exitCode=0 Apr 28 19:35:23.373899 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:23.373874 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" 
event={"ID":"a799eefe-a5f4-45cb-af20-50c943b0b9e4","Type":"ContainerDied","Data":"f27b9af8c79fe504944b2b37f7cf16c923401de9598b8d6b6f43a889a7f58dbe"} Apr 28 19:35:24.378806 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:24.378773 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" event={"ID":"a799eefe-a5f4-45cb-af20-50c943b0b9e4","Type":"ContainerStarted","Data":"0ce987df1b22416136da0e082ef62db713e95aa4958ac38c67b4bd831d5a5fc5"} Apr 28 19:35:24.378806 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:24.378808 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" event={"ID":"a799eefe-a5f4-45cb-af20-50c943b0b9e4","Type":"ContainerStarted","Data":"6e93b48635f65a2d93e330eaca2b6943472e974d188d27f9ee0b6fd5b0222cc2"} Apr 28 19:35:24.379268 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:24.379092 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:24.379268 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:24.379205 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:24.380308 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:24.380284 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 28 19:35:24.399552 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:24.399503 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" podStartSLOduration=6.399488337 podStartE2EDuration="6.399488337s" 
podCreationTimestamp="2026-04-28 19:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:35:24.398311601 +0000 UTC m=+1166.070756706" watchObservedRunningTime="2026-04-28 19:35:24.399488337 +0000 UTC m=+1166.071933436" Apr 28 19:35:25.381390 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:25.381356 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 28 19:35:30.386278 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:30.386250 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:35:30.386884 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:30.386855 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 28 19:35:40.386883 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:40.386838 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 28 19:35:50.387613 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:35:50.387572 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.28:8080: connect: connection refused" Apr 28 19:36:00.386846 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:00.386794 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 28 19:36:10.387450 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:10.387409 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 28 19:36:20.386773 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:20.386725 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 28 19:36:30.388320 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:30.388289 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:36:39.077365 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.077331 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt"] Apr 28 19:36:39.077747 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.077637 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kserve-container" containerID="cri-o://6e93b48635f65a2d93e330eaca2b6943472e974d188d27f9ee0b6fd5b0222cc2" 
gracePeriod=30 Apr 28 19:36:39.077747 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.077672 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kube-rbac-proxy" containerID="cri-o://0ce987df1b22416136da0e082ef62db713e95aa4958ac38c67b4bd831d5a5fc5" gracePeriod=30 Apr 28 19:36:39.288797 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.288762 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf"] Apr 28 19:36:39.289082 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.289069 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df050d88-56c8-49d8-b981-4aaf0edfc7cf" containerName="storage-initializer" Apr 28 19:36:39.289135 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.289084 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="df050d88-56c8-49d8-b981-4aaf0edfc7cf" containerName="storage-initializer" Apr 28 19:36:39.289194 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.289145 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="df050d88-56c8-49d8-b981-4aaf0edfc7cf" containerName="storage-initializer" Apr 28 19:36:39.289237 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.289212 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df050d88-56c8-49d8-b981-4aaf0edfc7cf" containerName="storage-initializer" Apr 28 19:36:39.289237 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.289220 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="df050d88-56c8-49d8-b981-4aaf0edfc7cf" containerName="storage-initializer" Apr 28 19:36:39.289305 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.289265 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="df050d88-56c8-49d8-b981-4aaf0edfc7cf" containerName="storage-initializer" Apr 28 
19:36:39.292118 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.292101 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:39.294960 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.294939 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-cdca2-kube-rbac-proxy-sar-config\"" Apr 28 19:36:39.295068 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.294973 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-cdca2-predictor-serving-cert\"" Apr 28 19:36:39.304810 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.304785 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf"] Apr 28 19:36:39.369956 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.369869 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bbc6869-5623-45de-a9f8-e60c722b36a7-kserve-provision-location\") pod \"raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:39.369956 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.369911 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8mzx\" (UniqueName: \"kubernetes.io/projected/1bbc6869-5623-45de-a9f8-e60c722b36a7-kube-api-access-q8mzx\") pod \"raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:39.369956 ip-10-0-134-36 kubenswrapper[2572]: I0428 
19:36:39.369937 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1bbc6869-5623-45de-a9f8-e60c722b36a7-proxy-tls\") pod \"raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:39.370245 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.370068 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-runtime-cdca2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1bbc6869-5623-45de-a9f8-e60c722b36a7-raw-sklearn-runtime-cdca2-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:39.470757 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.470717 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1bbc6869-5623-45de-a9f8-e60c722b36a7-proxy-tls\") pod \"raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:39.470912 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.470813 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-runtime-cdca2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1bbc6869-5623-45de-a9f8-e60c722b36a7-raw-sklearn-runtime-cdca2-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:39.470912 ip-10-0-134-36 
kubenswrapper[2572]: I0428 19:36:39.470872 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bbc6869-5623-45de-a9f8-e60c722b36a7-kserve-provision-location\") pod \"raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:39.470912 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.470909 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8mzx\" (UniqueName: \"kubernetes.io/projected/1bbc6869-5623-45de-a9f8-e60c722b36a7-kube-api-access-q8mzx\") pod \"raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:39.471345 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.471320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bbc6869-5623-45de-a9f8-e60c722b36a7-kserve-provision-location\") pod \"raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:39.471535 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.471516 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-runtime-cdca2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1bbc6869-5623-45de-a9f8-e60c722b36a7-raw-sklearn-runtime-cdca2-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:39.473130 ip-10-0-134-36 kubenswrapper[2572]: 
I0428 19:36:39.473112 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1bbc6869-5623-45de-a9f8-e60c722b36a7-proxy-tls\") pod \"raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:39.478808 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.478786 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8mzx\" (UniqueName: \"kubernetes.io/projected/1bbc6869-5623-45de-a9f8-e60c722b36a7-kube-api-access-q8mzx\") pod \"raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:39.586921 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.586888 2572 generic.go:358] "Generic (PLEG): container finished" podID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerID="0ce987df1b22416136da0e082ef62db713e95aa4958ac38c67b4bd831d5a5fc5" exitCode=2 Apr 28 19:36:39.587092 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.586967 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" event={"ID":"a799eefe-a5f4-45cb-af20-50c943b0b9e4","Type":"ContainerDied","Data":"0ce987df1b22416136da0e082ef62db713e95aa4958ac38c67b4bd831d5a5fc5"} Apr 28 19:36:39.601529 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.601510 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:39.754242 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:39.754214 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf"] Apr 28 19:36:39.756509 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:36:39.756477 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bbc6869_5623_45de_a9f8_e60c722b36a7.slice/crio-5ff2244346a5846290ca5ea03af8be0c9a0690198d7dd55aea1f61177efb2f3c WatchSource:0}: Error finding container 5ff2244346a5846290ca5ea03af8be0c9a0690198d7dd55aea1f61177efb2f3c: Status 404 returned error can't find the container with id 5ff2244346a5846290ca5ea03af8be0c9a0690198d7dd55aea1f61177efb2f3c Apr 28 19:36:40.382148 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:40.382101 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 28 19:36:40.387487 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:40.387453 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 28 19:36:40.591406 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:40.591370 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" 
event={"ID":"1bbc6869-5623-45de-a9f8-e60c722b36a7","Type":"ContainerStarted","Data":"15306ead3d1aa85281b4b7cf62cdec8ccb87569dd1d093639996982c8731c272"} Apr 28 19:36:40.591406 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:40.591409 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" event={"ID":"1bbc6869-5623-45de-a9f8-e60c722b36a7","Type":"ContainerStarted","Data":"5ff2244346a5846290ca5ea03af8be0c9a0690198d7dd55aea1f61177efb2f3c"} Apr 28 19:36:43.516516 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.516493 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:36:43.600820 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.600784 2572 generic.go:358] "Generic (PLEG): container finished" podID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerID="6e93b48635f65a2d93e330eaca2b6943472e974d188d27f9ee0b6fd5b0222cc2" exitCode=0 Apr 28 19:36:43.601004 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.600857 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" event={"ID":"a799eefe-a5f4-45cb-af20-50c943b0b9e4","Type":"ContainerDied","Data":"6e93b48635f65a2d93e330eaca2b6943472e974d188d27f9ee0b6fd5b0222cc2"} Apr 28 19:36:43.601004 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.600890 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" event={"ID":"a799eefe-a5f4-45cb-af20-50c943b0b9e4","Type":"ContainerDied","Data":"42c7e43368e0f8f268ed4668868f9fec88fbbac53bfd045c9ba2badbfb1b1bef"} Apr 28 19:36:43.601004 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.600909 2572 scope.go:117] "RemoveContainer" containerID="0ce987df1b22416136da0e082ef62db713e95aa4958ac38c67b4bd831d5a5fc5" Apr 28 19:36:43.601004 ip-10-0-134-36 kubenswrapper[2572]: I0428 
19:36:43.600861 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt" Apr 28 19:36:43.607205 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.607161 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k5c7\" (UniqueName: \"kubernetes.io/projected/a799eefe-a5f4-45cb-af20-50c943b0b9e4-kube-api-access-2k5c7\") pod \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " Apr 28 19:36:43.607325 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.607235 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-3e269-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a799eefe-a5f4-45cb-af20-50c943b0b9e4-raw-sklearn-3e269-kube-rbac-proxy-sar-config\") pod \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " Apr 28 19:36:43.607406 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.607388 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a799eefe-a5f4-45cb-af20-50c943b0b9e4-kserve-provision-location\") pod \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " Apr 28 19:36:43.607454 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.607428 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a799eefe-a5f4-45cb-af20-50c943b0b9e4-proxy-tls\") pod \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\" (UID: \"a799eefe-a5f4-45cb-af20-50c943b0b9e4\") " Apr 28 19:36:43.607567 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.607547 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a799eefe-a5f4-45cb-af20-50c943b0b9e4-raw-sklearn-3e269-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-3e269-kube-rbac-proxy-sar-config") pod "a799eefe-a5f4-45cb-af20-50c943b0b9e4" (UID: "a799eefe-a5f4-45cb-af20-50c943b0b9e4"). InnerVolumeSpecName "raw-sklearn-3e269-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:36:43.607768 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.607725 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a799eefe-a5f4-45cb-af20-50c943b0b9e4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a799eefe-a5f4-45cb-af20-50c943b0b9e4" (UID: "a799eefe-a5f4-45cb-af20-50c943b0b9e4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:36:43.607916 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.607896 2572 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-3e269-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a799eefe-a5f4-45cb-af20-50c943b0b9e4-raw-sklearn-3e269-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:36:43.607978 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.607922 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a799eefe-a5f4-45cb-af20-50c943b0b9e4-kserve-provision-location\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:36:43.609454 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.609423 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a799eefe-a5f4-45cb-af20-50c943b0b9e4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a799eefe-a5f4-45cb-af20-50c943b0b9e4" (UID: "a799eefe-a5f4-45cb-af20-50c943b0b9e4"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:36:43.609725 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.609696 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a799eefe-a5f4-45cb-af20-50c943b0b9e4-kube-api-access-2k5c7" (OuterVolumeSpecName: "kube-api-access-2k5c7") pod "a799eefe-a5f4-45cb-af20-50c943b0b9e4" (UID: "a799eefe-a5f4-45cb-af20-50c943b0b9e4"). InnerVolumeSpecName "kube-api-access-2k5c7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:36:43.609725 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.609720 2572 scope.go:117] "RemoveContainer" containerID="6e93b48635f65a2d93e330eaca2b6943472e974d188d27f9ee0b6fd5b0222cc2" Apr 28 19:36:43.622376 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.622353 2572 scope.go:117] "RemoveContainer" containerID="f27b9af8c79fe504944b2b37f7cf16c923401de9598b8d6b6f43a889a7f58dbe" Apr 28 19:36:43.630607 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.630590 2572 scope.go:117] "RemoveContainer" containerID="0ce987df1b22416136da0e082ef62db713e95aa4958ac38c67b4bd831d5a5fc5" Apr 28 19:36:43.630898 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:36:43.630877 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce987df1b22416136da0e082ef62db713e95aa4958ac38c67b4bd831d5a5fc5\": container with ID starting with 0ce987df1b22416136da0e082ef62db713e95aa4958ac38c67b4bd831d5a5fc5 not found: ID does not exist" containerID="0ce987df1b22416136da0e082ef62db713e95aa4958ac38c67b4bd831d5a5fc5" Apr 28 19:36:43.630964 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.630905 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce987df1b22416136da0e082ef62db713e95aa4958ac38c67b4bd831d5a5fc5"} err="failed to get container status \"0ce987df1b22416136da0e082ef62db713e95aa4958ac38c67b4bd831d5a5fc5\": rpc error: code = NotFound desc = 
could not find container \"0ce987df1b22416136da0e082ef62db713e95aa4958ac38c67b4bd831d5a5fc5\": container with ID starting with 0ce987df1b22416136da0e082ef62db713e95aa4958ac38c67b4bd831d5a5fc5 not found: ID does not exist" Apr 28 19:36:43.630964 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.630926 2572 scope.go:117] "RemoveContainer" containerID="6e93b48635f65a2d93e330eaca2b6943472e974d188d27f9ee0b6fd5b0222cc2" Apr 28 19:36:43.631359 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:36:43.631181 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e93b48635f65a2d93e330eaca2b6943472e974d188d27f9ee0b6fd5b0222cc2\": container with ID starting with 6e93b48635f65a2d93e330eaca2b6943472e974d188d27f9ee0b6fd5b0222cc2 not found: ID does not exist" containerID="6e93b48635f65a2d93e330eaca2b6943472e974d188d27f9ee0b6fd5b0222cc2" Apr 28 19:36:43.631359 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.631219 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e93b48635f65a2d93e330eaca2b6943472e974d188d27f9ee0b6fd5b0222cc2"} err="failed to get container status \"6e93b48635f65a2d93e330eaca2b6943472e974d188d27f9ee0b6fd5b0222cc2\": rpc error: code = NotFound desc = could not find container \"6e93b48635f65a2d93e330eaca2b6943472e974d188d27f9ee0b6fd5b0222cc2\": container with ID starting with 6e93b48635f65a2d93e330eaca2b6943472e974d188d27f9ee0b6fd5b0222cc2 not found: ID does not exist" Apr 28 19:36:43.631359 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.631236 2572 scope.go:117] "RemoveContainer" containerID="f27b9af8c79fe504944b2b37f7cf16c923401de9598b8d6b6f43a889a7f58dbe" Apr 28 19:36:43.631532 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:36:43.631469 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f27b9af8c79fe504944b2b37f7cf16c923401de9598b8d6b6f43a889a7f58dbe\": container 
with ID starting with f27b9af8c79fe504944b2b37f7cf16c923401de9598b8d6b6f43a889a7f58dbe not found: ID does not exist" containerID="f27b9af8c79fe504944b2b37f7cf16c923401de9598b8d6b6f43a889a7f58dbe" Apr 28 19:36:43.631532 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.631487 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f27b9af8c79fe504944b2b37f7cf16c923401de9598b8d6b6f43a889a7f58dbe"} err="failed to get container status \"f27b9af8c79fe504944b2b37f7cf16c923401de9598b8d6b6f43a889a7f58dbe\": rpc error: code = NotFound desc = could not find container \"f27b9af8c79fe504944b2b37f7cf16c923401de9598b8d6b6f43a889a7f58dbe\": container with ID starting with f27b9af8c79fe504944b2b37f7cf16c923401de9598b8d6b6f43a889a7f58dbe not found: ID does not exist" Apr 28 19:36:43.708835 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.708782 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2k5c7\" (UniqueName: \"kubernetes.io/projected/a799eefe-a5f4-45cb-af20-50c943b0b9e4-kube-api-access-2k5c7\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:36:43.708835 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.708821 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a799eefe-a5f4-45cb-af20-50c943b0b9e4-proxy-tls\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:36:43.924625 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.924550 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt"] Apr 28 19:36:43.928986 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:43.928959 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-3e269-predictor-75dc5c954c-zt7nt"] Apr 28 19:36:44.605756 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:44.605721 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerID="15306ead3d1aa85281b4b7cf62cdec8ccb87569dd1d093639996982c8731c272" exitCode=0 Apr 28 19:36:44.605756 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:44.605771 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" event={"ID":"1bbc6869-5623-45de-a9f8-e60c722b36a7","Type":"ContainerDied","Data":"15306ead3d1aa85281b4b7cf62cdec8ccb87569dd1d093639996982c8731c272"} Apr 28 19:36:44.889046 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:44.888970 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" path="/var/lib/kubelet/pods/a799eefe-a5f4-45cb-af20-50c943b0b9e4/volumes" Apr 28 19:36:45.610079 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:45.610045 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" event={"ID":"1bbc6869-5623-45de-a9f8-e60c722b36a7","Type":"ContainerStarted","Data":"0ce1b95ff31b14f14ac9891d7d4681857e5b91886f4e206a34339a3fb912368c"} Apr 28 19:36:45.610079 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:45.610081 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" event={"ID":"1bbc6869-5623-45de-a9f8-e60c722b36a7","Type":"ContainerStarted","Data":"6183c3b889059b21e2fcf1ff128c6dfd2fe4319c84bdd6e8ecc6de6b0edf68c6"} Apr 28 19:36:45.610587 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:45.610437 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:45.610587 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:45.610566 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:45.611900 
ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:45.611872 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:36:45.629638 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:45.629592 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" podStartSLOduration=6.629577597 podStartE2EDuration="6.629577597s" podCreationTimestamp="2026-04-28 19:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:36:45.62778061 +0000 UTC m=+1247.300225724" watchObservedRunningTime="2026-04-28 19:36:45.629577597 +0000 UTC m=+1247.302022770" Apr 28 19:36:46.613479 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:46.613439 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:36:51.617373 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:51.617344 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:36:51.617898 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:36:51.617872 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:37:01.618628 
ip-10-0-134-36 kubenswrapper[2572]: I0428 19:37:01.618582 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:37:11.618677 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:37:11.618629 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:37:21.618606 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:37:21.618565 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:37:31.618648 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:37:31.618602 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:37:41.618050 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:37:41.618015 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:37:51.618771 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:37:51.618741 2572 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:37:58.570244 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:37:58.570212 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf"] Apr 28 19:37:58.570705 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:37:58.570618 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kserve-container" containerID="cri-o://6183c3b889059b21e2fcf1ff128c6dfd2fe4319c84bdd6e8ecc6de6b0edf68c6" gracePeriod=30 Apr 28 19:37:58.570705 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:37:58.570655 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kube-rbac-proxy" containerID="cri-o://0ce1b95ff31b14f14ac9891d7d4681857e5b91886f4e206a34339a3fb912368c" gracePeriod=30 Apr 28 19:37:58.816673 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:37:58.816641 2572 generic.go:358] "Generic (PLEG): container finished" podID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerID="0ce1b95ff31b14f14ac9891d7d4681857e5b91886f4e206a34339a3fb912368c" exitCode=2 Apr 28 19:37:58.816827 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:37:58.816700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" event={"ID":"1bbc6869-5623-45de-a9f8-e60c722b36a7","Type":"ContainerDied","Data":"0ce1b95ff31b14f14ac9891d7d4681857e5b91886f4e206a34339a3fb912368c"} Apr 28 19:38:01.614354 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:01.614313 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused" Apr 28 19:38:01.618593 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:01.618571 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 28 19:38:02.610206 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.610163 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:38:02.686006 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.685977 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8mzx\" (UniqueName: \"kubernetes.io/projected/1bbc6869-5623-45de-a9f8-e60c722b36a7-kube-api-access-q8mzx\") pod \"1bbc6869-5623-45de-a9f8-e60c722b36a7\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " Apr 28 19:38:02.686456 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.686029 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bbc6869-5623-45de-a9f8-e60c722b36a7-kserve-provision-location\") pod \"1bbc6869-5623-45de-a9f8-e60c722b36a7\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " Apr 28 19:38:02.686456 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.686053 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-runtime-cdca2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/1bbc6869-5623-45de-a9f8-e60c722b36a7-raw-sklearn-runtime-cdca2-kube-rbac-proxy-sar-config\") pod \"1bbc6869-5623-45de-a9f8-e60c722b36a7\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " Apr 28 19:38:02.686456 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.686222 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1bbc6869-5623-45de-a9f8-e60c722b36a7-proxy-tls\") pod \"1bbc6869-5623-45de-a9f8-e60c722b36a7\" (UID: \"1bbc6869-5623-45de-a9f8-e60c722b36a7\") " Apr 28 19:38:02.686456 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.686354 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bbc6869-5623-45de-a9f8-e60c722b36a7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1bbc6869-5623-45de-a9f8-e60c722b36a7" (UID: "1bbc6869-5623-45de-a9f8-e60c722b36a7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:38:02.686456 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.686362 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bbc6869-5623-45de-a9f8-e60c722b36a7-raw-sklearn-runtime-cdca2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-runtime-cdca2-kube-rbac-proxy-sar-config") pod "1bbc6869-5623-45de-a9f8-e60c722b36a7" (UID: "1bbc6869-5623-45de-a9f8-e60c722b36a7"). InnerVolumeSpecName "raw-sklearn-runtime-cdca2-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:38:02.686456 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.686455 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bbc6869-5623-45de-a9f8-e60c722b36a7-kserve-provision-location\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:38:02.688154 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.688131 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bbc6869-5623-45de-a9f8-e60c722b36a7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1bbc6869-5623-45de-a9f8-e60c722b36a7" (UID: "1bbc6869-5623-45de-a9f8-e60c722b36a7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:38:02.688407 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.688201 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bbc6869-5623-45de-a9f8-e60c722b36a7-kube-api-access-q8mzx" (OuterVolumeSpecName: "kube-api-access-q8mzx") pod "1bbc6869-5623-45de-a9f8-e60c722b36a7" (UID: "1bbc6869-5623-45de-a9f8-e60c722b36a7"). InnerVolumeSpecName "kube-api-access-q8mzx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:38:02.786759 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.786694 2572 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-runtime-cdca2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1bbc6869-5623-45de-a9f8-e60c722b36a7-raw-sklearn-runtime-cdca2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:38:02.786759 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.786718 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1bbc6869-5623-45de-a9f8-e60c722b36a7-proxy-tls\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:38:02.786759 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.786730 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q8mzx\" (UniqueName: \"kubernetes.io/projected/1bbc6869-5623-45de-a9f8-e60c722b36a7-kube-api-access-q8mzx\") on node \"ip-10-0-134-36.ec2.internal\" DevicePath \"\"" Apr 28 19:38:02.829408 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.829374 2572 generic.go:358] "Generic (PLEG): container finished" podID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerID="6183c3b889059b21e2fcf1ff128c6dfd2fe4319c84bdd6e8ecc6de6b0edf68c6" exitCode=0 Apr 28 19:38:02.829552 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.829415 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" event={"ID":"1bbc6869-5623-45de-a9f8-e60c722b36a7","Type":"ContainerDied","Data":"6183c3b889059b21e2fcf1ff128c6dfd2fe4319c84bdd6e8ecc6de6b0edf68c6"} Apr 28 19:38:02.829552 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.829444 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" 
event={"ID":"1bbc6869-5623-45de-a9f8-e60c722b36a7","Type":"ContainerDied","Data":"5ff2244346a5846290ca5ea03af8be0c9a0690198d7dd55aea1f61177efb2f3c"} Apr 28 19:38:02.829552 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.829451 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf" Apr 28 19:38:02.829552 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.829462 2572 scope.go:117] "RemoveContainer" containerID="0ce1b95ff31b14f14ac9891d7d4681857e5b91886f4e206a34339a3fb912368c" Apr 28 19:38:02.837708 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.837584 2572 scope.go:117] "RemoveContainer" containerID="6183c3b889059b21e2fcf1ff128c6dfd2fe4319c84bdd6e8ecc6de6b0edf68c6" Apr 28 19:38:02.844146 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.844127 2572 scope.go:117] "RemoveContainer" containerID="15306ead3d1aa85281b4b7cf62cdec8ccb87569dd1d093639996982c8731c272" Apr 28 19:38:02.850334 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.850311 2572 scope.go:117] "RemoveContainer" containerID="0ce1b95ff31b14f14ac9891d7d4681857e5b91886f4e206a34339a3fb912368c" Apr 28 19:38:02.850563 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:38:02.850545 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce1b95ff31b14f14ac9891d7d4681857e5b91886f4e206a34339a3fb912368c\": container with ID starting with 0ce1b95ff31b14f14ac9891d7d4681857e5b91886f4e206a34339a3fb912368c not found: ID does not exist" containerID="0ce1b95ff31b14f14ac9891d7d4681857e5b91886f4e206a34339a3fb912368c" Apr 28 19:38:02.850633 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.850570 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce1b95ff31b14f14ac9891d7d4681857e5b91886f4e206a34339a3fb912368c"} err="failed to get container status 
\"0ce1b95ff31b14f14ac9891d7d4681857e5b91886f4e206a34339a3fb912368c\": rpc error: code = NotFound desc = could not find container \"0ce1b95ff31b14f14ac9891d7d4681857e5b91886f4e206a34339a3fb912368c\": container with ID starting with 0ce1b95ff31b14f14ac9891d7d4681857e5b91886f4e206a34339a3fb912368c not found: ID does not exist" Apr 28 19:38:02.850633 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.850587 2572 scope.go:117] "RemoveContainer" containerID="6183c3b889059b21e2fcf1ff128c6dfd2fe4319c84bdd6e8ecc6de6b0edf68c6" Apr 28 19:38:02.850787 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:38:02.850770 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6183c3b889059b21e2fcf1ff128c6dfd2fe4319c84bdd6e8ecc6de6b0edf68c6\": container with ID starting with 6183c3b889059b21e2fcf1ff128c6dfd2fe4319c84bdd6e8ecc6de6b0edf68c6 not found: ID does not exist" containerID="6183c3b889059b21e2fcf1ff128c6dfd2fe4319c84bdd6e8ecc6de6b0edf68c6" Apr 28 19:38:02.850836 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.850791 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6183c3b889059b21e2fcf1ff128c6dfd2fe4319c84bdd6e8ecc6de6b0edf68c6"} err="failed to get container status \"6183c3b889059b21e2fcf1ff128c6dfd2fe4319c84bdd6e8ecc6de6b0edf68c6\": rpc error: code = NotFound desc = could not find container \"6183c3b889059b21e2fcf1ff128c6dfd2fe4319c84bdd6e8ecc6de6b0edf68c6\": container with ID starting with 6183c3b889059b21e2fcf1ff128c6dfd2fe4319c84bdd6e8ecc6de6b0edf68c6 not found: ID does not exist" Apr 28 19:38:02.850836 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.850804 2572 scope.go:117] "RemoveContainer" containerID="15306ead3d1aa85281b4b7cf62cdec8ccb87569dd1d093639996982c8731c272" Apr 28 19:38:02.853625 ip-10-0-134-36 kubenswrapper[2572]: E0428 19:38:02.851402 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"15306ead3d1aa85281b4b7cf62cdec8ccb87569dd1d093639996982c8731c272\": container with ID starting with 15306ead3d1aa85281b4b7cf62cdec8ccb87569dd1d093639996982c8731c272 not found: ID does not exist" containerID="15306ead3d1aa85281b4b7cf62cdec8ccb87569dd1d093639996982c8731c272" Apr 28 19:38:02.853625 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.851430 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15306ead3d1aa85281b4b7cf62cdec8ccb87569dd1d093639996982c8731c272"} err="failed to get container status \"15306ead3d1aa85281b4b7cf62cdec8ccb87569dd1d093639996982c8731c272\": rpc error: code = NotFound desc = could not find container \"15306ead3d1aa85281b4b7cf62cdec8ccb87569dd1d093639996982c8731c272\": container with ID starting with 15306ead3d1aa85281b4b7cf62cdec8ccb87569dd1d093639996982c8731c272 not found: ID does not exist" Apr 28 19:38:02.853625 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.851663 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf"] Apr 28 19:38:02.859240 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.859220 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-cdca2-predictor-65dc66dfd5-gj7gf"] Apr 28 19:38:02.889399 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:02.889376 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" path="/var/lib/kubelet/pods/1bbc6869-5623-45de-a9f8-e60c722b36a7/volumes" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.770563 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rqj59/must-gather-bxq66"] Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771137 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="storage-initializer" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771155 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="storage-initializer" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771197 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kserve-container" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771206 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kserve-container" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771216 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kube-rbac-proxy" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771225 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kube-rbac-proxy" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771241 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kserve-container" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771249 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kserve-container" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771259 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="storage-initializer" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771267 2572 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="storage-initializer" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771288 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kube-rbac-proxy" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771296 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kube-rbac-proxy" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771403 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kube-rbac-proxy" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771423 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kube-rbac-proxy" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771434 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1bbc6869-5623-45de-a9f8-e60c722b36a7" containerName="kserve-container" Apr 28 19:38:23.772877 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.771451 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a799eefe-a5f4-45cb-af20-50c943b0b9e4" containerName="kserve-container" Apr 28 19:38:23.775951 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.775925 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rqj59/must-gather-bxq66" Apr 28 19:38:23.778556 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.778534 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rqj59\"/\"openshift-service-ca.crt\"" Apr 28 19:38:23.778706 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.778569 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rqj59\"/\"kube-root-ca.crt\"" Apr 28 19:38:23.778706 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.778641 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rqj59\"/\"default-dockercfg-bqjfr\"" Apr 28 19:38:23.781157 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.781136 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rqj59/must-gather-bxq66"] Apr 28 19:38:23.947647 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.947607 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qsx5\" (UniqueName: \"kubernetes.io/projected/37b8e78e-2bbf-4ae5-8886-bd09cb4e19bf-kube-api-access-5qsx5\") pod \"must-gather-bxq66\" (UID: \"37b8e78e-2bbf-4ae5-8886-bd09cb4e19bf\") " pod="openshift-must-gather-rqj59/must-gather-bxq66" Apr 28 19:38:23.947647 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:23.947650 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/37b8e78e-2bbf-4ae5-8886-bd09cb4e19bf-must-gather-output\") pod \"must-gather-bxq66\" (UID: \"37b8e78e-2bbf-4ae5-8886-bd09cb4e19bf\") " pod="openshift-must-gather-rqj59/must-gather-bxq66" Apr 28 19:38:24.048979 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:24.048890 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qsx5\" (UniqueName: 
\"kubernetes.io/projected/37b8e78e-2bbf-4ae5-8886-bd09cb4e19bf-kube-api-access-5qsx5\") pod \"must-gather-bxq66\" (UID: \"37b8e78e-2bbf-4ae5-8886-bd09cb4e19bf\") " pod="openshift-must-gather-rqj59/must-gather-bxq66" Apr 28 19:38:24.048979 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:24.048941 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/37b8e78e-2bbf-4ae5-8886-bd09cb4e19bf-must-gather-output\") pod \"must-gather-bxq66\" (UID: \"37b8e78e-2bbf-4ae5-8886-bd09cb4e19bf\") " pod="openshift-must-gather-rqj59/must-gather-bxq66" Apr 28 19:38:24.049347 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:24.049329 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/37b8e78e-2bbf-4ae5-8886-bd09cb4e19bf-must-gather-output\") pod \"must-gather-bxq66\" (UID: \"37b8e78e-2bbf-4ae5-8886-bd09cb4e19bf\") " pod="openshift-must-gather-rqj59/must-gather-bxq66" Apr 28 19:38:24.057576 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:24.057546 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qsx5\" (UniqueName: \"kubernetes.io/projected/37b8e78e-2bbf-4ae5-8886-bd09cb4e19bf-kube-api-access-5qsx5\") pod \"must-gather-bxq66\" (UID: \"37b8e78e-2bbf-4ae5-8886-bd09cb4e19bf\") " pod="openshift-must-gather-rqj59/must-gather-bxq66" Apr 28 19:38:24.085856 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:24.085823 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rqj59/must-gather-bxq66" Apr 28 19:38:24.204741 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:24.204707 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rqj59/must-gather-bxq66"] Apr 28 19:38:24.208097 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:38:24.208069 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37b8e78e_2bbf_4ae5_8886_bd09cb4e19bf.slice/crio-3d8e0def045f28c90ff4c3177575dd56788f987e2e30b9d7df996de6419f21b5 WatchSource:0}: Error finding container 3d8e0def045f28c90ff4c3177575dd56788f987e2e30b9d7df996de6419f21b5: Status 404 returned error can't find the container with id 3d8e0def045f28c90ff4c3177575dd56788f987e2e30b9d7df996de6419f21b5 Apr 28 19:38:24.892758 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:24.892717 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rqj59/must-gather-bxq66" event={"ID":"37b8e78e-2bbf-4ae5-8886-bd09cb4e19bf","Type":"ContainerStarted","Data":"3d8e0def045f28c90ff4c3177575dd56788f987e2e30b9d7df996de6419f21b5"} Apr 28 19:38:25.898274 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:25.897765 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rqj59/must-gather-bxq66" event={"ID":"37b8e78e-2bbf-4ae5-8886-bd09cb4e19bf","Type":"ContainerStarted","Data":"f6d742f8411216f92f83c01280a28e2c83bd33a8c3ce678729d6c57a3dddd921"} Apr 28 19:38:25.898274 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:25.897830 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rqj59/must-gather-bxq66" event={"ID":"37b8e78e-2bbf-4ae5-8886-bd09cb4e19bf","Type":"ContainerStarted","Data":"4b31eb023f4476ad362300d66c6e83aab5a845d4876f220dda612da10c244c20"} Apr 28 19:38:25.914710 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:25.914655 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-rqj59/must-gather-bxq66" podStartSLOduration=2.084980227 podStartE2EDuration="2.914638369s" podCreationTimestamp="2026-04-28 19:38:23 +0000 UTC" firstStartedPulling="2026-04-28 19:38:24.209705222 +0000 UTC m=+1345.882150304" lastFinishedPulling="2026-04-28 19:38:25.039363347 +0000 UTC m=+1346.711808446" observedRunningTime="2026-04-28 19:38:25.912837587 +0000 UTC m=+1347.585282703" watchObservedRunningTime="2026-04-28 19:38:25.914638369 +0000 UTC m=+1347.587083473" Apr 28 19:38:26.430590 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:26.430551 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-mdlds_98dda8bf-ccac-4d10-a55e-8d1b2d3121b1/global-pull-secret-syncer/0.log" Apr 28 19:38:26.636081 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:26.635905 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tp2kj_704ee9d6-f870-489e-8d81-6488fb22d8be/konnectivity-agent/0.log" Apr 28 19:38:26.678838 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:26.678810 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-36.ec2.internal_c871a044bef489e8809d7f6e127e2847/haproxy/0.log" Apr 28 19:38:30.239948 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:30.239912 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-d558fffc9-9lkpp_9b8f93a4-9552-476f-b918-baf1e61ac066/metrics-server/0.log" Apr 28 19:38:30.299831 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:30.299767 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4mgnd_c766c71a-a924-40ab-bc77-ac0e493e0671/node-exporter/0.log" Apr 28 19:38:30.324963 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:30.324873 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-4mgnd_c766c71a-a924-40ab-bc77-ac0e493e0671/kube-rbac-proxy/0.log" Apr 28 19:38:30.352505 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:30.352477 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4mgnd_c766c71a-a924-40ab-bc77-ac0e493e0671/init-textfile/0.log" Apr 28 19:38:30.825525 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:30.825463 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-bcbf69d6c-7952f_dffa9e69-20fa-40fa-897f-dd771bc75935/telemeter-client/0.log" Apr 28 19:38:30.849433 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:30.849400 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-bcbf69d6c-7952f_dffa9e69-20fa-40fa-897f-dd771bc75935/reload/0.log" Apr 28 19:38:30.875585 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:30.875545 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-bcbf69d6c-7952f_dffa9e69-20fa-40fa-897f-dd771bc75935/kube-rbac-proxy/0.log" Apr 28 19:38:33.568275 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.568238 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml"] Apr 28 19:38:33.572955 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.572924 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.578308 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.578279 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml"] Apr 28 19:38:33.643113 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.643072 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/612d8596-d823-477f-ad98-edbdf1725c7d-proc\") pod \"perf-node-gather-daemonset-pv4ml\" (UID: \"612d8596-d823-477f-ad98-edbdf1725c7d\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.643113 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.643112 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/612d8596-d823-477f-ad98-edbdf1725c7d-sys\") pod \"perf-node-gather-daemonset-pv4ml\" (UID: \"612d8596-d823-477f-ad98-edbdf1725c7d\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.643379 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.643143 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c2xl\" (UniqueName: \"kubernetes.io/projected/612d8596-d823-477f-ad98-edbdf1725c7d-kube-api-access-8c2xl\") pod \"perf-node-gather-daemonset-pv4ml\" (UID: \"612d8596-d823-477f-ad98-edbdf1725c7d\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.643379 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.643220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/612d8596-d823-477f-ad98-edbdf1725c7d-podres\") pod \"perf-node-gather-daemonset-pv4ml\" (UID: \"612d8596-d823-477f-ad98-edbdf1725c7d\") " 
pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.643379 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.643255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/612d8596-d823-477f-ad98-edbdf1725c7d-lib-modules\") pod \"perf-node-gather-daemonset-pv4ml\" (UID: \"612d8596-d823-477f-ad98-edbdf1725c7d\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.744727 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.744689 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8c2xl\" (UniqueName: \"kubernetes.io/projected/612d8596-d823-477f-ad98-edbdf1725c7d-kube-api-access-8c2xl\") pod \"perf-node-gather-daemonset-pv4ml\" (UID: \"612d8596-d823-477f-ad98-edbdf1725c7d\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.745000 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.744979 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/612d8596-d823-477f-ad98-edbdf1725c7d-podres\") pod \"perf-node-gather-daemonset-pv4ml\" (UID: \"612d8596-d823-477f-ad98-edbdf1725c7d\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.745149 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.745133 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/612d8596-d823-477f-ad98-edbdf1725c7d-lib-modules\") pod \"perf-node-gather-daemonset-pv4ml\" (UID: \"612d8596-d823-477f-ad98-edbdf1725c7d\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.745253 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.745139 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/612d8596-d823-477f-ad98-edbdf1725c7d-podres\") pod \"perf-node-gather-daemonset-pv4ml\" (UID: \"612d8596-d823-477f-ad98-edbdf1725c7d\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.745322 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.745260 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/612d8596-d823-477f-ad98-edbdf1725c7d-lib-modules\") pod \"perf-node-gather-daemonset-pv4ml\" (UID: \"612d8596-d823-477f-ad98-edbdf1725c7d\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.745400 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.745382 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/612d8596-d823-477f-ad98-edbdf1725c7d-proc\") pod \"perf-node-gather-daemonset-pv4ml\" (UID: \"612d8596-d823-477f-ad98-edbdf1725c7d\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.745458 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.745416 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/612d8596-d823-477f-ad98-edbdf1725c7d-sys\") pod \"perf-node-gather-daemonset-pv4ml\" (UID: \"612d8596-d823-477f-ad98-edbdf1725c7d\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.745508 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.745465 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/612d8596-d823-477f-ad98-edbdf1725c7d-proc\") pod \"perf-node-gather-daemonset-pv4ml\" (UID: \"612d8596-d823-477f-ad98-edbdf1725c7d\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.745556 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.745517 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/612d8596-d823-477f-ad98-edbdf1725c7d-sys\") pod \"perf-node-gather-daemonset-pv4ml\" (UID: \"612d8596-d823-477f-ad98-edbdf1725c7d\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.753969 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.753941 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c2xl\" (UniqueName: \"kubernetes.io/projected/612d8596-d823-477f-ad98-edbdf1725c7d-kube-api-access-8c2xl\") pod \"perf-node-gather-daemonset-pv4ml\" (UID: \"612d8596-d823-477f-ad98-edbdf1725c7d\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:33.886566 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:33.886460 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:34.031874 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:34.031845 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml"] Apr 28 19:38:34.036080 ip-10-0-134-36 kubenswrapper[2572]: W0428 19:38:34.036048 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod612d8596_d823_477f_ad98_edbdf1725c7d.slice/crio-eb5328cd0ebfa71fc6f270252622d49651b4e2a3b69ee3231e5fd2c6586e6808 WatchSource:0}: Error finding container eb5328cd0ebfa71fc6f270252622d49651b4e2a3b69ee3231e5fd2c6586e6808: Status 404 returned error can't find the container with id eb5328cd0ebfa71fc6f270252622d49651b4e2a3b69ee3231e5fd2c6586e6808 Apr 28 19:38:34.038048 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:34.038030 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:38:34.128128 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:34.128097 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-xcxrp_a9198734-608e-4a54-8ac4-e6b0cefdc390/dns/0.log" Apr 28 19:38:34.150344 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:34.150277 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xcxrp_a9198734-608e-4a54-8ac4-e6b0cefdc390/kube-rbac-proxy/0.log" Apr 28 19:38:34.219896 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:34.219870 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vzbxw_ccc8edc1-f281-4f6c-b9d7-56c52685d934/dns-node-resolver/0.log" Apr 28 19:38:34.604550 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:34.604519 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-549d5fc798-r44zx_6b94e1fb-0c0f-4aeb-a297-100d7d91353d/registry/0.log" Apr 28 19:38:34.668437 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:34.668402 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wtz8s_b959573c-d823-4400-ac66-5f111c6ec711/node-ca/0.log" Apr 28 19:38:34.930060 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:34.929974 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" event={"ID":"612d8596-d823-477f-ad98-edbdf1725c7d","Type":"ContainerStarted","Data":"a0b821f625391a6094f834fb2835fba622bb34467d8e45a34e531c6cf3565dce"} Apr 28 19:38:34.930060 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:34.930012 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" event={"ID":"612d8596-d823-477f-ad98-edbdf1725c7d","Type":"ContainerStarted","Data":"eb5328cd0ebfa71fc6f270252622d49651b4e2a3b69ee3231e5fd2c6586e6808"} Apr 28 19:38:34.930060 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:34.930045 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:34.946631 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:34.946576 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" podStartSLOduration=1.946557178 podStartE2EDuration="1.946557178s" podCreationTimestamp="2026-04-28 19:38:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:38:34.9447094 +0000 UTC m=+1356.617154529" watchObservedRunningTime="2026-04-28 19:38:34.946557178 +0000 UTC m=+1356.619002283" Apr 28 19:38:35.728406 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:35.728371 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zv5k4_847fde1a-8b63-481f-998b-c119fa746ad5/serve-healthcheck-canary/0.log" Apr 28 19:38:36.105310 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:36.105236 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4m8n9_dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea/kube-rbac-proxy/0.log" Apr 28 19:38:36.127266 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:36.127237 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4m8n9_dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea/exporter/0.log" Apr 28 19:38:36.148856 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:36.148826 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4m8n9_dd8f41b0-57cb-4aa6-8b24-91e2ea5466ea/extractor/0.log" Apr 28 19:38:38.499088 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:38.499057 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-mx5cm_15442544-6447-448d-9c8d-67ac00ac5bc8/server/0.log" Apr 28 19:38:39.274602 ip-10-0-134-36 kubenswrapper[2572]: I0428 
19:38:39.274545 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-gcsqh_dc1e6ebe-07bc-4bb8-be03-ec0b5719f1ce/manager/0.log" Apr 28 19:38:39.775650 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:39.775610 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-kwt2h_fed37cca-a5e0-405d-bdee-c9bb70721562/seaweedfs/0.log" Apr 28 19:38:40.943613 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:40.943584 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-pv4ml" Apr 28 19:38:45.025764 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:45.025730 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-58mkl_5369f9de-bd9f-4ef6-877b-ef3932a99bd9/kube-multus/0.log" Apr 28 19:38:45.082997 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:45.082970 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hvlw8_33cdcd08-3e9e-4229-b74c-2e99bdeb2074/kube-multus-additional-cni-plugins/0.log" Apr 28 19:38:45.106636 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:45.106565 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hvlw8_33cdcd08-3e9e-4229-b74c-2e99bdeb2074/egress-router-binary-copy/0.log" Apr 28 19:38:45.130006 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:45.129972 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hvlw8_33cdcd08-3e9e-4229-b74c-2e99bdeb2074/cni-plugins/0.log" Apr 28 19:38:45.153360 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:45.153326 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hvlw8_33cdcd08-3e9e-4229-b74c-2e99bdeb2074/bond-cni-plugin/0.log" Apr 28 19:38:45.177523 ip-10-0-134-36 kubenswrapper[2572]: I0428 
19:38:45.177484 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hvlw8_33cdcd08-3e9e-4229-b74c-2e99bdeb2074/routeoverride-cni/0.log" Apr 28 19:38:45.199453 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:45.199417 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hvlw8_33cdcd08-3e9e-4229-b74c-2e99bdeb2074/whereabouts-cni-bincopy/0.log" Apr 28 19:38:45.221152 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:45.221125 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hvlw8_33cdcd08-3e9e-4229-b74c-2e99bdeb2074/whereabouts-cni/0.log" Apr 28 19:38:45.611816 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:45.611786 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kbztx_a341bf63-a680-4dba-8ba9-7f2a8180d537/network-metrics-daemon/0.log" Apr 28 19:38:45.633307 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:45.633279 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kbztx_a341bf63-a680-4dba-8ba9-7f2a8180d537/kube-rbac-proxy/0.log" Apr 28 19:38:46.940783 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:46.940753 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8r2c_88d48ce5-a443-48e0-b53f-8ffb2088ab5f/ovn-controller/0.log" Apr 28 19:38:46.968874 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:46.968840 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8r2c_88d48ce5-a443-48e0-b53f-8ffb2088ab5f/ovn-acl-logging/0.log" Apr 28 19:38:46.991078 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:46.991042 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8r2c_88d48ce5-a443-48e0-b53f-8ffb2088ab5f/kube-rbac-proxy-node/0.log" Apr 28 
19:38:47.019513 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:47.019486 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8r2c_88d48ce5-a443-48e0-b53f-8ffb2088ab5f/kube-rbac-proxy-ovn-metrics/0.log" Apr 28 19:38:47.043802 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:47.043766 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8r2c_88d48ce5-a443-48e0-b53f-8ffb2088ab5f/northd/0.log" Apr 28 19:38:47.067900 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:47.067859 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8r2c_88d48ce5-a443-48e0-b53f-8ffb2088ab5f/nbdb/0.log" Apr 28 19:38:47.092508 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:47.092482 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8r2c_88d48ce5-a443-48e0-b53f-8ffb2088ab5f/sbdb/0.log" Apr 28 19:38:47.206430 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:47.206389 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8r2c_88d48ce5-a443-48e0-b53f-8ffb2088ab5f/ovnkube-controller/0.log" Apr 28 19:38:48.244881 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:48.244845 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-n6x95_29413dbb-70ef-4e06-8580-98f854320cbb/network-check-target-container/0.log" Apr 28 19:38:49.136603 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:49.136567 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-mqccp_971a7028-d715-4000-af2d-1ba9ff023dae/iptables-alerter/0.log" Apr 28 19:38:49.864164 ip-10-0-134-36 kubenswrapper[2572]: I0428 19:38:49.864131 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-8p5gv_e10b87c8-9253-42f3-a4fb-0a8212d8fd26/tuned/0.log"