Apr 16 22:13:39.633510 ip-10-0-133-16 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 22:13:40.165467 ip-10-0-133-16 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:40.165467 ip-10-0-133-16 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 22:13:40.165467 ip-10-0-133-16 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:40.165467 ip-10-0-133-16 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 22:13:40.165467 ip-10-0-133-16 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:40.168587 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.168448 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 22:13:40.172263 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172238 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:40.172263 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172258 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:40.172263 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172262 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:40.172263 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172266 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:40.172263 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172269 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:40.172263 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172272 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172276 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172279 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172281 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172285 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172287 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172290 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172292 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172295 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172298 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172300 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172303 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172305 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172308 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172310 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172313 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172316 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172319 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172322 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172328 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:40.172498 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172331 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172334 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172336 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172339 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172342 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172344 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172347 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172350 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172352 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172355 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172357 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172360 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172362 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172365 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172368 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172371 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172375 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172377 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172380 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172383 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:40.173055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172385 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172388 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172393 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172396 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172399 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172402 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172404 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172407 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172410 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172412 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172415 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172417 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172419 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172422 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172425 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172427 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172429 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172432 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172435 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:40.173596 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172437 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172440 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172443 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172446 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172449 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172451 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172454 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172457 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172460 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172463 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172465 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172468 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172472 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172476 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172480 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172483 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172485 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172488 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172490 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:40.174063 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172493 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172495 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172498 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172906 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172912 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172915 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172918 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172920 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172923 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172926 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172928 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172931 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172934 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172936 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172939 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172941 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172944 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172947 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172949 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172952 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:40.174577 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172956 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172959 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172962 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172965 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172967 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172970 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172973 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172976 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172979 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172981 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172984 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172986 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172988 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172991 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172993 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172996 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.172999 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173002 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173004 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:40.175055 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173007 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173009 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173012 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173015 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173017 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173020 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173022 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173025 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173027 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173030 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173032 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173035 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173039 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173041 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173045 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173047 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173050 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173053 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173055 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173058 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:40.175548 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173061 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173064 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173076 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173080 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173084 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173088 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173091 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173094 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173097 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173100 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173119 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173122 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173125 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173128 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173131 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173134 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173138 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173141 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173144 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:40.176036 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173147 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173150 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173153 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173155 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173158 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173161 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173164 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173167 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173170 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173172 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.173175 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173897 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173908 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173917 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173922 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173927 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173930 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173935 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173940 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173943 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173946 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 22:13:40.176520 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173950 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173953 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173956 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173960 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173963 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173966 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173969 2572 flags.go:64] FLAG: --cloud-config=""
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173972 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173975 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173980 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173983 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173986 2572 flags.go:64] FLAG: --config-dir=""
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173989 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173992 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.173996 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174000 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174003 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174007 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174010 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174013 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174017 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174020 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174023 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174028 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174031 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 22:13:40.177037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174034 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174037 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174041 2572 flags.go:64] FLAG: --enable-server="true"
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174045 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174051 2572 flags.go:64] FLAG: --event-burst="100"
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174054 2572 flags.go:64] FLAG: --event-qps="50"
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174057 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174060 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174063 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174068 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174071 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174074 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174077 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174080 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174086 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174089 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174093 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174096 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174099 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174115 2572 flags.go:64] FLAG: --feature-gates=""
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174119 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174122 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16
22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174126 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174129 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174132 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 16 22:13:40.177665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174136 2572 flags.go:64] FLAG: --help="false" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174139 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174142 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174146 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174149 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174152 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174156 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174159 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174162 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174165 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174169 2572 flags.go:64] 
FLAG: --kube-api-burst="100" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174172 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174175 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174178 2572 flags.go:64] FLAG: --kube-reserved="" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174181 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174183 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174186 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174189 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174192 2572 flags.go:64] FLAG: --lock-file="" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174195 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174198 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174202 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174208 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 22:13:40.178296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174211 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174214 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 
22:13:40.174217 2572 flags.go:64] FLAG: --logging-format="text" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174220 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174224 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174227 2572 flags.go:64] FLAG: --manifest-url="" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174230 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174235 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174238 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174242 2572 flags.go:64] FLAG: --max-pods="110" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174246 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174250 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174252 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174256 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174259 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174262 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174265 2572 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174273 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174277 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174280 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174284 2572 flags.go:64] FLAG: --pod-cidr="" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174287 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174293 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174296 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 22:13:40.178845 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174299 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174302 2572 flags.go:64] FLAG: --port="10250" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174305 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174309 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e8618cab3f91a4d0" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174312 2572 flags.go:64] FLAG: --qos-reserved="" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174315 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174319 2572 
flags.go:64] FLAG: --register-node="true" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174322 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174325 2572 flags.go:64] FLAG: --register-with-taints="" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174329 2572 flags.go:64] FLAG: --registry-burst="10" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174332 2572 flags.go:64] FLAG: --registry-qps="5" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174335 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174359 2572 flags.go:64] FLAG: --reserved-memory="" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174365 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174368 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174372 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174375 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174378 2572 flags.go:64] FLAG: --runonce="false" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174381 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174384 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174387 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174390 2572 
flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174393 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174396 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174399 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174402 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 22:13:40.179475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174404 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174407 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174410 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174418 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174421 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174424 2572 flags.go:64] FLAG: --system-cgroups="" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174427 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174433 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174436 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174439 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 
22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174444 2572 flags.go:64] FLAG: --tls-min-version="" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174446 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174450 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174453 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174456 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174460 2572 flags.go:64] FLAG: --v="2" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174464 2572 flags.go:64] FLAG: --version="false" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174468 2572 flags.go:64] FLAG: --vmodule="" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174473 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.174476 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174575 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174579 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174582 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:40.180139 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174585 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:40.180139 
ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174588 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174591 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174594 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174597 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174615 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174619 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174622 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174626 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174629 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174632 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174635 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174640 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174643 2572 feature_gate.go:328] unrecognized feature gate: 
MachineConfigNodes Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174646 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174648 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174651 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174654 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174656 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174659 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174661 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:40.180717 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174665 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174667 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174670 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174673 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174677 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174681 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174684 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174687 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174689 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174692 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174695 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174697 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174700 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174703 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174705 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174708 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174710 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174713 2572 
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174715 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:40.181306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174718 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174720 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174724 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174726 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174730 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174733 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174736 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174739 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174741 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174744 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174746 2572 feature_gate.go:328] unrecognized feature gate: 
RouteAdvertisements Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174749 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174751 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174754 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174757 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174759 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174762 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174764 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174766 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174769 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:40.181780 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174771 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174774 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174777 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174781 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174784 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174787 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174789 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174792 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174794 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174797 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174799 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174802 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174804 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174807 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174809 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174812 2572 feature_gate.go:328] 
unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174816 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174818 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174820 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:40.182291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174823 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174826 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174828 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.174831 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.175730 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.182295 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.182315 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182373 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182379 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182382 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182385 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182388 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182391 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182394 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182397 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182400 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:40.182769 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182402 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182405 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182408 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182411 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182414 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182416 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182419 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182421 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182424 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182426 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182429 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182431 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182434 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182438 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182442 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182445 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182448 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182451 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182454 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182457 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:40.183304 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182460 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182463 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182465 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182468 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182470 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182473 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182475 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182478 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182480 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182483 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182485 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182488 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182490 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182493 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182495 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182498 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182500 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182503 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182506 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182508 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:40.183793 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182511 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182514 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182516 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182518 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182521 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182524 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182526 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182529 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182531 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182535 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182537 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182540 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182543 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182546 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182550 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182554 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182557 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182560 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182563 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:40.184300 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182566 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182569 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182571 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182574 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182576 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182579 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182582 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182585 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182587 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182590 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182593 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182595 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182597 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182600 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182602 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182605 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182607 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:40.184773 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182610 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:40.185231 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.182615 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:40.185231 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182723 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:40.185231 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182728 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:40.185231 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182731 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:40.185231 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182734 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:40.185231 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182738 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:40.185231 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182741 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:40.185231 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182744 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:40.185231 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182747 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:40.185231 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182749 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:40.185231 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182752 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:40.185231 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182754 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:40.185231 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182757 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:40.185231 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182759 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:40.185231 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182762 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182764 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182768 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182772 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182775 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182778 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182782 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182785 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182788 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182791 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182793 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182796 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182799 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182801 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182804 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182806 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182808 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182811 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:40.185611 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182813 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182816 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182818 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182821 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182824 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182827 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182830 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182833 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182836 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182838 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182841 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182843 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182846 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182848 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182851 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182853 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182856 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182858 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182860 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182863 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:40.186047 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182865 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182868 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182871 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182873 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182876 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182878 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182881 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182883 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182886 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182888 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182890 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182893 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182895 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182898 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182901 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182903 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182906 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182909 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182912 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182915 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:40.186750 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182918 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:40.187489 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182920 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:40.187489 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182923 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:40.187489 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182926 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:40.187489 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182928 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:40.187489 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182930 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:40.187489 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182933 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:40.187489 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182936 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:40.187489 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182938 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:40.187489 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182941 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:40.187489 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182943 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:40.187489 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182945 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:40.187489 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182948 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:40.187489 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182951 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:40.187489 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:40.182953 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:40.187489 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.182958 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:40.187863 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.183820 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 22:13:40.187863 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.186186 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 22:13:40.187863 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.187187 2572 server.go:1019] "Starting client certificate rotation"
Apr 16 22:13:40.187863 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.187295 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:13:40.187863 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.187340 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:13:40.220374 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.220347 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 22:13:40.226348 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.226315 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 22:13:40.246216 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.246187 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 16 22:13:40.251466 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.251443 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:13:40.253379 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.253365 2572 log.go:25] "Validated CRI v1 image API"
Apr 16 22:13:40.255019 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.254999 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 22:13:40.260254 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.260225 2572 fs.go:135] Filesystem UUIDs: map[01a40a68-7741-49cb-bf36-277d98ef90e9:/dev/nvme0n1p4 09e87fc0-6d4b-4077-965d-938d3a6b05e0:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 22:13:40.260358 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.260253 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 22:13:40.266574 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.266435 2572 manager.go:217] Machine: {Timestamp:2026-04-16 22:13:40.264168922 +0000 UTC m=+0.489335293 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3076017 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec268b26d1786aa6ad0b7867ad6bc06d SystemUUID:ec268b26-d178-6aa6-ad0b-7867ad6bc06d BootID:a4fe3ecd-5b1f-45a6-a5d7-4dc4c0aa4d95 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d8:11:b5:86:49 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d8:11:b5:86:49 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:7e:a2:85:4e:46:ab Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 22:13:40.266574 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.266556 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 22:13:40.266795 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.266776 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 22:13:40.267982 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.267947 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 22:13:40.268194 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.267982 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-16.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 22:13:40.268275 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.268209 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 22:13:40.268275 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.268223 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 22:13:40.268275 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.268242 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 22:13:40.268932 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.268913 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f79qp"
Apr 16 22:13:40.268980 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.268948 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 22:13:40.270643 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.270629 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 22:13:40.270789 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.270778 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 22:13:40.273397 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.273384 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 22:13:40.273471 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.273411 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 22:13:40.273471 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.273430 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 22:13:40.273471 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.273443 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 16 22:13:40.273471 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.273458 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 22:13:40.275329 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.275311 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 22:13:40.275409 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.275344 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 22:13:40.276871 ip-10-0-133-16
kubenswrapper[2572]: I0416 22:13:40.276847 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f79qp" Apr 16 22:13:40.278853 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.278835 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 22:13:40.281264 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.281247 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 22:13:40.282587 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.282567 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 22:13:40.282587 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.282589 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 22:13:40.282701 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.282596 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 22:13:40.282701 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.282602 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 22:13:40.282701 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.282608 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 22:13:40.282701 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.282614 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 22:13:40.282701 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.282620 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 22:13:40.282701 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.282626 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 22:13:40.282701 ip-10-0-133-16 kubenswrapper[2572]: I0416 
22:13:40.282634 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 22:13:40.282701 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.282641 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 22:13:40.282701 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.282649 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 22:13:40.282701 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.282658 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 22:13:40.283499 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.283486 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 22:13:40.283541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.283502 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 22:13:40.292143 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.292118 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:40.292698 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.292685 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 22:13:40.292771 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.292734 2572 server.go:1295] "Started kubelet" Apr 16 22:13:40.292863 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.292835 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 22:13:40.292925 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.292874 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 22:13:40.292971 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.292945 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 22:13:40.293403 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.293385 2572 reflector.go:430] "Caches populated" type="*v1.Service" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:40.293866 ip-10-0-133-16 systemd[1]: Started Kubernetes Kubelet. Apr 16 22:13:40.293979 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.293964 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 22:13:40.294841 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.294728 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-16.ec2.internal" not found Apr 16 22:13:40.295225 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.295211 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 16 22:13:40.300782 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.300757 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 22:13:40.300782 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.300757 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 22:13:40.301538 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.301512 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 22:13:40.301538 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.301521 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 22:13:40.301686 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.301546 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 22:13:40.301731 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.301721 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 16 22:13:40.301731 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.301730 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 16 22:13:40.302263 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:40.302238 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-16.ec2.internal\" not found" Apr 16 22:13:40.303066 ip-10-0-133-16 
kubenswrapper[2572]: I0416 22:13:40.303041 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:40.303269 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.303251 2572 factory.go:55] Registering systemd factory Apr 16 22:13:40.303343 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.303280 2572 factory.go:223] Registration of the systemd container factory successfully Apr 16 22:13:40.303644 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.303627 2572 factory.go:153] Registering CRI-O factory Apr 16 22:13:40.303644 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.303646 2572 factory.go:223] Registration of the crio container factory successfully Apr 16 22:13:40.303782 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.303706 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 22:13:40.303782 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.303734 2572 factory.go:103] Registering Raw factory Apr 16 22:13:40.303782 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.303751 2572 manager.go:1196] Started watching for new ooms in manager Apr 16 22:13:40.304304 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.304279 2572 manager.go:319] Starting recovery of all containers Apr 16 22:13:40.305268 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:40.305243 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 22:13:40.305844 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:40.305819 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-16.ec2.internal\" not found" node="ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.310820 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.310798 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-16.ec2.internal" not found Apr 16 22:13:40.313670 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.313654 2572 manager.go:324] Recovery completed Apr 16 22:13:40.320349 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.320333 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:40.322476 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.322458 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-16.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:40.322551 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.322496 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-16.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:40.322551 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.322512 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-16.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:40.323046 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.323029 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 22:13:40.323046 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.323043 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 22:13:40.323169 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.323063 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:13:40.325921 ip-10-0-133-16 kubenswrapper[2572]: 
I0416 22:13:40.325909 2572 policy_none.go:49] "None policy: Start" Apr 16 22:13:40.325985 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.325924 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 22:13:40.325985 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.325934 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 16 22:13:40.366843 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.366766 2572 manager.go:341] "Starting Device Plugin manager" Apr 16 22:13:40.366843 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:40.366812 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 22:13:40.366843 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.366825 2572 server.go:85] "Starting device plugin registration server" Apr 16 22:13:40.370598 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.367090 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 22:13:40.370598 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.367119 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 22:13:40.370598 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.367206 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 22:13:40.370598 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.367322 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 22:13:40.370598 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.367332 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 22:13:40.370598 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:40.367749 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 22:13:40.370598 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:40.367790 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-16.ec2.internal\" not found" Apr 16 22:13:40.371907 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.371892 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-16.ec2.internal" not found Apr 16 22:13:40.437889 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.437804 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 22:13:40.439091 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.439074 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 22:13:40.439151 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.439117 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 22:13:40.439151 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.439141 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 22:13:40.439151 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.439147 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 22:13:40.439253 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:40.439244 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 22:13:40.441469 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.441448 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:40.467389 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.467348 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:40.469612 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.469593 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-16.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:40.469726 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.469626 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-16.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:40.469726 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.469641 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-16.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:40.469726 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.469667 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.479005 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.478980 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.539398 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.539356 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-133-16.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal"] 
Apr 16 22:13:40.542093 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.542075 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.542197 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.542120 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.559792 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.559766 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.563793 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.563774 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.574899 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.574876 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 22:13:40.577442 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.577424 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 22:13:40.704282 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.704177 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b67f45154e8dd1656374ab98590cc7b6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal\" (UID: \"b67f45154e8dd1656374ab98590cc7b6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.704282 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.704222 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1f3ee50f97b2910d9326d3a396c451df-config\") pod \"kube-apiserver-proxy-ip-10-0-133-16.ec2.internal\" (UID: \"1f3ee50f97b2910d9326d3a396c451df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.704282 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.704246 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b67f45154e8dd1656374ab98590cc7b6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal\" (UID: \"b67f45154e8dd1656374ab98590cc7b6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.804997 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.804967 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1f3ee50f97b2910d9326d3a396c451df-config\") pod \"kube-apiserver-proxy-ip-10-0-133-16.ec2.internal\" (UID: \"1f3ee50f97b2910d9326d3a396c451df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.804997 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.805003 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b67f45154e8dd1656374ab98590cc7b6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal\" (UID: \"b67f45154e8dd1656374ab98590cc7b6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.805219 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.805021 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b67f45154e8dd1656374ab98590cc7b6-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal\" (UID: \"b67f45154e8dd1656374ab98590cc7b6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.805219 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.805093 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b67f45154e8dd1656374ab98590cc7b6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal\" (UID: \"b67f45154e8dd1656374ab98590cc7b6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.805219 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.805096 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1f3ee50f97b2910d9326d3a396c451df-config\") pod \"kube-apiserver-proxy-ip-10-0-133-16.ec2.internal\" (UID: \"1f3ee50f97b2910d9326d3a396c451df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.805219 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.805130 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b67f45154e8dd1656374ab98590cc7b6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal\" (UID: \"b67f45154e8dd1656374ab98590cc7b6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.877183 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.877147 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-16.ec2.internal" Apr 16 22:13:40.880802 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:40.880780 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal" Apr 16 22:13:41.187964 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.187931 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 22:13:41.188826 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.188092 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 22:13:41.188826 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.188140 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 22:13:41.188826 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.188139 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 22:13:41.273951 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.273776 2572 apiserver.go:52] "Watching apiserver" Apr 16 22:13:41.279130 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.279086 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 22:08:40 +0000 UTC" deadline="2028-01-08 04:16:45.89331515 +0000 UTC" Apr 16 22:13:41.279130 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.279128 2572 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15150h3m4.614189445s" Apr 16 22:13:41.282255 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.282230 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 22:13:41.282569 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.282547 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-28c4j","openshift-dns/node-resolver-wr7c8","openshift-image-registry/node-ca-7969m","openshift-multus/multus-additional-cni-plugins-m9jnj","openshift-multus/network-metrics-daemon-bvqqh","openshift-network-diagnostics/network-check-target-zh7bj","openshift-network-operator/iptables-alerter-m2782","kube-system/konnectivity-agent-rkbz7","kube-system/kube-apiserver-proxy-ip-10-0-133-16.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal","openshift-multus/multus-pkxvv","openshift-ovn-kubernetes/ovnkube-node-6bqmn"] Apr 16 22:13:41.284302 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.284281 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.285929 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.285910 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wr7c8" Apr 16 22:13:41.286968 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.286940 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:13:41.287072 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.287027 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-b9t9q\"" Apr 16 22:13:41.287142 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.287067 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 22:13:41.287288 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.287270 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7969m" Apr 16 22:13:41.288173 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.288154 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 22:13:41.288398 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.288381 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 22:13:41.288490 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.288474 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-k442x\"" Apr 16 22:13:41.288730 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.288713 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m9jnj" Apr 16 22:13:41.291144 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.289834 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 22:13:41.291144 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.290120 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-prx6s\"" Apr 16 22:13:41.291144 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.290693 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 22:13:41.291144 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.290756 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 22:13:41.291542 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.291513 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 22:13:41.291638 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.291523 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh" Apr 16 22:13:41.291726 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:41.291706 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3" Apr 16 22:13:41.291864 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.291848 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 22:13:41.291942 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.291890 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 22:13:41.292000 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.291955 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 22:13:41.292744 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.292724 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-tlmhv\"" Apr 16 22:13:41.292833 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.292757 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 22:13:41.293061 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.293048 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:13:41.293157 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:41.293139 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31" Apr 16 22:13:41.294361 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.294347 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m2782" Apr 16 22:13:41.295604 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.295587 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rkbz7" Apr 16 22:13:41.296536 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.296514 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:13:41.296755 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.296740 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-rh7fg\"" Apr 16 22:13:41.296822 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.296804 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp" Apr 16 22:13:41.296888 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.296843 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 22:13:41.297189 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.297174 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 22:13:41.297886 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.297872 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 22:13:41.297983 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.297947 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.298090 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.297995 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 22:13:41.299269 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.299251 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 22:13:41.299362 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.299275 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 22:13:41.299362 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.299279 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vmjgr\"" Apr 16 22:13:41.299495 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.299428 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dpgrh\"" Apr 16 22:13:41.299687 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.299671 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.299796 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.299742 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 22:13:41.300472 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.300456 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 22:13:41.300564 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.300490 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7v4r4\"" Apr 16 22:13:41.301468 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.301255 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 22:13:41.302012 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.301995 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 22:13:41.302012 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.302004 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 22:13:41.302201 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.302011 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-swkrs\"" Apr 16 22:13:41.302279 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.302258 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 22:13:41.302343 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.302329 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 22:13:41.302675 
ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.302659 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 22:13:41.303264 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.303249 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 22:13:41.303612 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.303593 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 22:13:41.307702 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.307680 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs\") pod \"network-metrics-daemon-bvqqh\" (UID: \"bb0e10b7-e200-4438-8847-608a1a6cace3\") " pod="openshift-multus/network-metrics-daemon-bvqqh" Apr 16 22:13:41.307788 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.307708 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/749275aa-640a-4adf-ae88-0d843cc46536-host-slash\") pod \"iptables-alerter-m2782\" (UID: \"749275aa-640a-4adf-ae88-0d843cc46536\") " pod="openshift-network-operator/iptables-alerter-m2782" Apr 16 22:13:41.307788 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.307723 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-os-release\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.307788 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.307748 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-multus-socket-dir-parent\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.307946 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.307792 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-run-multus-certs\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.307946 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.307841 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57bb8201-a601-40d9-9277-43dfb8dd33cd-host\") pod \"node-ca-7969m\" (UID: \"57bb8201-a601-40d9-9277-43dfb8dd33cd\") " pod="openshift-image-registry/node-ca-7969m" Apr 16 22:13:41.307946 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.307875 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/749275aa-640a-4adf-ae88-0d843cc46536-iptables-alerter-script\") pod \"iptables-alerter-m2782\" (UID: \"749275aa-640a-4adf-ae88-0d843cc46536\") " pod="openshift-network-operator/iptables-alerter-m2782" Apr 16 22:13:41.307946 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.307904 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-etc-openvswitch\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.307946 ip-10-0-133-16 kubenswrapper[2572]: I0416 
22:13:41.307933 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.308165 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.307969 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2x2d\" (UniqueName: \"kubernetes.io/projected/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-kube-api-access-g2x2d\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp" Apr 16 22:13:41.308165 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308013 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-slash\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.308165 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308037 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-socket-dir\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp" Apr 16 22:13:41.308165 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308052 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-sysctl-d\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.308165 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308071 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-multus-cni-dir\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.308165 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308097 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-run-netns\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.308165 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308136 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-multus-daemon-config\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.308165 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308152 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-cni-netd\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.308165 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308167 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj" Apr 16 22:13:41.308433 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308183 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5f94db8f-97a3-4835-b7e4-7ef02819127d-agent-certs\") pod \"konnectivity-agent-rkbz7\" (UID: \"5f94db8f-97a3-4835-b7e4-7ef02819127d\") " pod="kube-system/konnectivity-agent-rkbz7" Apr 16 22:13:41.308433 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308200 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-kubelet\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.308433 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308223 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-run-netns\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.308433 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308262 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-node-log\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 
16 22:13:41.308433 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308302 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6pm2\" (UniqueName: \"kubernetes.io/projected/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-kube-api-access-d6pm2\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.308433 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308328 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/07fc2833-ed6f-40da-ac15-b1ce9aa369c6-hosts-file\") pod \"node-resolver-wr7c8\" (UID: \"07fc2833-ed6f-40da-ac15-b1ce9aa369c6\") " pod="openshift-dns/node-resolver-wr7c8" Apr 16 22:13:41.308433 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308343 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/07fc2833-ed6f-40da-ac15-b1ce9aa369c6-tmp-dir\") pod \"node-resolver-wr7c8\" (UID: \"07fc2833-ed6f-40da-ac15-b1ce9aa369c6\") " pod="openshift-dns/node-resolver-wr7c8" Apr 16 22:13:41.308433 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308386 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-sys-fs\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp" Apr 16 22:13:41.308433 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308421 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: 
\"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp" Apr 16 22:13:41.308688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308453 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-device-dir\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp" Apr 16 22:13:41.308688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308480 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-etc-selinux\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp" Apr 16 22:13:41.308688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308501 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thd79\" (UniqueName: \"kubernetes.io/projected/08c160f3-c9e8-42dc-beae-941858c69dfe-kube-api-access-thd79\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.308688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308516 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-var-lib-kubelet\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.308688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308537 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx5md\" (UniqueName: \"kubernetes.io/projected/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-kube-api-access-jx5md\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.308688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308552 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-systemd-units\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.308688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308574 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.308688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308591 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82g6r\" (UniqueName: \"kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r\") pod \"network-check-target-zh7bj\" (UID: \"6ea887bd-cdbc-451b-9d3d-df42e2023e31\") " pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:13:41.308688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308605 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-systemd\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " 
pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.308688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308618 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-run\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.308688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308633 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-var-lib-kubelet\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.308688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308653 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9b7d\" (UniqueName: \"kubernetes.io/projected/bb0e10b7-e200-4438-8847-608a1a6cace3-kube-api-access-m9b7d\") pod \"network-metrics-daemon-bvqqh\" (UID: \"bb0e10b7-e200-4438-8847-608a1a6cace3\") " pod="openshift-multus/network-metrics-daemon-bvqqh" Apr 16 22:13:41.308688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308675 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-etc-kubernetes\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.309213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308701 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-run-systemd\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.309213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308738 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-registration-dir\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp" Apr 16 22:13:41.309213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-hostroot\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.309213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308775 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-multus-conf-dir\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.309213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308795 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-run-ovn\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.309213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308817 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-log-socket\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.309213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308834 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-system-cni-dir\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj" Apr 16 22:13:41.309213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308848 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj" Apr 16 22:13:41.309213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308862 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-ovnkube-config\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.309213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308900 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwvvd\" (UniqueName: \"kubernetes.io/projected/57bb8201-a601-40d9-9277-43dfb8dd33cd-kube-api-access-jwvvd\") pod \"node-ca-7969m\" (UID: 
\"57bb8201-a601-40d9-9277-43dfb8dd33cd\") " pod="openshift-image-registry/node-ca-7969m" Apr 16 22:13:41.309213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308917 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-sys\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.309213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308930 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/08c160f3-c9e8-42dc-beae-941858c69dfe-tmp\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.309213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308945 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-run-k8s-cni-cncf-io\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.309213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.308983 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2tqt\" (UniqueName: \"kubernetes.io/projected/07fc2833-ed6f-40da-ac15-b1ce9aa369c6-kube-api-access-d2tqt\") pod \"node-resolver-wr7c8\" (UID: \"07fc2833-ed6f-40da-ac15-b1ce9aa369c6\") " pod="openshift-dns/node-resolver-wr7c8" Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309321 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/57bb8201-a601-40d9-9277-43dfb8dd33cd-serviceca\") pod \"node-ca-7969m\" (UID: \"57bb8201-a601-40d9-9277-43dfb8dd33cd\") " pod="openshift-image-registry/node-ca-7969m" Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309366 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5f94db8f-97a3-4835-b7e4-7ef02819127d-konnectivity-ca\") pod \"konnectivity-agent-rkbz7\" (UID: \"5f94db8f-97a3-4835-b7e4-7ef02819127d\") " pod="kube-system/konnectivity-agent-rkbz7" Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309392 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-lib-modules\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309417 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-ovnkube-script-lib\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309442 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cxg4\" (UniqueName: \"kubernetes.io/projected/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-kube-api-access-8cxg4\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj" Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 
22:13:41.309467 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-kubernetes\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309492 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-var-lib-cni-bin\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309546 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-var-lib-cni-multus\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309575 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-cnibin\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309600 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-env-overrides\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309623 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-cnibin\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj"
Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309648 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj"
Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309675 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-sysconfig\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309700 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-sysctl-conf\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309736 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-host\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.309859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309762 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-system-cni-dir\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.310384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309785 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-cni-binary-copy\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.310384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309809 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-ovn-node-metrics-cert\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.310384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309837 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-modprobe-d\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.310384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309868 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-tuned\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.310384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309894 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpxx2\" (UniqueName: \"kubernetes.io/projected/749275aa-640a-4adf-ae88-0d843cc46536-kube-api-access-rpxx2\") pod \"iptables-alerter-m2782\" (UID: \"749275aa-640a-4adf-ae88-0d843cc46536\") " pod="openshift-network-operator/iptables-alerter-m2782"
Apr 16 22:13:41.310384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309920 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-var-lib-openvswitch\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.310384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309964 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-run-openvswitch\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.310384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.309986 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-cni-bin\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.310384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.310004 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-os-release\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj"
Apr 16 22:13:41.310384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.310020 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-cni-binary-copy\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj"
Apr 16 22:13:41.311440 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.311422 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:13:41.334999 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.334955 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-th6dg"
Apr 16 22:13:41.342027 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.342002 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-th6dg"
Apr 16 22:13:41.410339 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410302 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-multus-cni-dir\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.410483 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-run-netns\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.410483 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-multus-daemon-config\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.410483 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410407 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-cni-netd\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.410483 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410408 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-multus-cni-dir\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.410483 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410456 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj"
Apr 16 22:13:41.410483 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410461 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-cni-netd\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5f94db8f-97a3-4835-b7e4-7ef02819127d-agent-certs\") pod \"konnectivity-agent-rkbz7\" (UID: \"5f94db8f-97a3-4835-b7e4-7ef02819127d\") " pod="kube-system/konnectivity-agent-rkbz7"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410521 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-kubelet\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410525 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-run-netns\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410542 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-run-netns\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410561 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-node-log\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6pm2\" (UniqueName: \"kubernetes.io/projected/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-kube-api-access-d6pm2\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410599 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/07fc2833-ed6f-40da-ac15-b1ce9aa369c6-hosts-file\") pod \"node-resolver-wr7c8\" (UID: \"07fc2833-ed6f-40da-ac15-b1ce9aa369c6\") " pod="openshift-dns/node-resolver-wr7c8"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410614 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/07fc2833-ed6f-40da-ac15-b1ce9aa369c6-tmp-dir\") pod \"node-resolver-wr7c8\" (UID: \"07fc2833-ed6f-40da-ac15-b1ce9aa369c6\") " pod="openshift-dns/node-resolver-wr7c8"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-sys-fs\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410644 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410660 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-device-dir\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410676 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-etc-selinux\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410692 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thd79\" (UniqueName: \"kubernetes.io/projected/08c160f3-c9e8-42dc-beae-941858c69dfe-kube-api-access-thd79\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410707 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-var-lib-kubelet\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410722 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jx5md\" (UniqueName: \"kubernetes.io/projected/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-kube-api-access-jx5md\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-systemd-units\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.410893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410756 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410773 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82g6r\" (UniqueName: \"kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r\") pod \"network-check-target-zh7bj\" (UID: \"6ea887bd-cdbc-451b-9d3d-df42e2023e31\") " pod="openshift-network-diagnostics/network-check-target-zh7bj"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410789 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-systemd\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410804 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-run\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410819 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-var-lib-kubelet\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410841 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9b7d\" (UniqueName: \"kubernetes.io/projected/bb0e10b7-e200-4438-8847-608a1a6cace3-kube-api-access-m9b7d\") pod \"network-metrics-daemon-bvqqh\" (UID: \"bb0e10b7-e200-4438-8847-608a1a6cace3\") " pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-etc-kubernetes\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410883 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-run-systemd\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410898 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-registration-dir\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410922 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-hostroot\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410938 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-multus-conf-dir\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410954 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-run-ovn\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410968 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-log-socket\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.410999 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-system-cni-dir\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411019 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411035 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-ovnkube-config\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411053 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwvvd\" (UniqueName: \"kubernetes.io/projected/57bb8201-a601-40d9-9277-43dfb8dd33cd-kube-api-access-jwvvd\") pod \"node-ca-7969m\" (UID: \"57bb8201-a601-40d9-9277-43dfb8dd33cd\") " pod="openshift-image-registry/node-ca-7969m"
Apr 16 22:13:41.411593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411068 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-sys\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411084 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/08c160f3-c9e8-42dc-beae-941858c69dfe-tmp\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411130 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-run-k8s-cni-cncf-io\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411150 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2tqt\" (UniqueName: \"kubernetes.io/projected/07fc2833-ed6f-40da-ac15-b1ce9aa369c6-kube-api-access-d2tqt\") pod \"node-resolver-wr7c8\" (UID: \"07fc2833-ed6f-40da-ac15-b1ce9aa369c6\") " pod="openshift-dns/node-resolver-wr7c8"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411172 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/57bb8201-a601-40d9-9277-43dfb8dd33cd-serviceca\") pod \"node-ca-7969m\" (UID: \"57bb8201-a601-40d9-9277-43dfb8dd33cd\") " pod="openshift-image-registry/node-ca-7969m"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411184 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-multus-daemon-config\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411190 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-run-ovn-kubernetes\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411261 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-run-k8s-cni-cncf-io\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411270 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-log-socket\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411307 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-run-ovn\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411315 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-system-cni-dir\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411351 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-sys\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411440 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-systemd\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411464 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411522 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-run\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411562 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-etc-kubernetes\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411603 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-var-lib-kubelet\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411605 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 22:13:41.412407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411623 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-run-systemd\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411661 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-sys-fs\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411676 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-registration-dir\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411706 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-kubelet\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411723 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-hostroot\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411755 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-multus-conf-dir\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411848 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411911 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/07fc2833-ed6f-40da-ac15-b1ce9aa369c6-hosts-file\") pod \"node-resolver-wr7c8\" (UID: \"07fc2833-ed6f-40da-ac15-b1ce9aa369c6\") " pod="openshift-dns/node-resolver-wr7c8"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411977 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-run-netns\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412011 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-node-log\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412057 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-ovnkube-config\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412238 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/07fc2833-ed6f-40da-ac15-b1ce9aa369c6-tmp-dir\") pod \"node-resolver-wr7c8\" (UID: \"07fc2833-ed6f-40da-ac15-b1ce9aa369c6\") " pod="openshift-dns/node-resolver-wr7c8"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412250 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412294 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-device-dir\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412308 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5f94db8f-97a3-4835-b7e4-7ef02819127d-konnectivity-ca\") pod \"konnectivity-agent-rkbz7\" (UID: \"5f94db8f-97a3-4835-b7e4-7ef02819127d\") " pod="kube-system/konnectivity-agent-rkbz7"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412339 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-etc-selinux\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.411196 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5f94db8f-97a3-4835-b7e4-7ef02819127d-konnectivity-ca\") pod \"konnectivity-agent-rkbz7\" (UID: \"5f94db8f-97a3-4835-b7e4-7ef02819127d\") " pod="kube-system/konnectivity-agent-rkbz7"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412358 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-var-lib-kubelet\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.413237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412378 2572 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-lib-modules\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412380 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-systemd-units\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412400 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-ovnkube-script-lib\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412412 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/57bb8201-a601-40d9-9277-43dfb8dd33cd-serviceca\") pod \"node-ca-7969m\" (UID: \"57bb8201-a601-40d9-9277-43dfb8dd33cd\") " pod="openshift-image-registry/node-ca-7969m" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412425 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cxg4\" (UniqueName: \"kubernetes.io/projected/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-kube-api-access-8cxg4\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412452 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-kubernetes\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-var-lib-cni-bin\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412476 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-lib-modules\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412499 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-var-lib-cni-multus\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412524 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-cnibin\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412529 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-kubernetes\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-env-overrides\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412570 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-var-lib-cni-multus\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412575 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-cnibin\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412609 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-cnibin\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412656 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-var-lib-cni-bin\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412610 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412615 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-cnibin\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj" Apr 16 22:13:41.414685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412702 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-sysconfig\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412732 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-sysctl-conf\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412753 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-host\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-system-cni-dir\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412803 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-cni-binary-copy\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412827 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-ovn-node-metrics-cert\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412851 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-modprobe-d\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412874 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-tuned\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412912 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-host\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412919 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpxx2\" (UniqueName: \"kubernetes.io/projected/749275aa-640a-4adf-ae88-0d843cc46536-kube-api-access-rpxx2\") pod \"iptables-alerter-m2782\" (UID: \"749275aa-640a-4adf-ae88-0d843cc46536\") " pod="openshift-network-operator/iptables-alerter-m2782" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412922 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-ovnkube-script-lib\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412967 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-sysconfig\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.412997 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-var-lib-openvswitch\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413024 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-run-openvswitch\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-sysctl-conf\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-cni-bin\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413057 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-env-overrides\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.415541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413093 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj" Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413134 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-cni-bin\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413149 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-system-cni-dir\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413152 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-os-release\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj" Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413171 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-var-lib-openvswitch\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.416071 ip-10-0-133-16 
kubenswrapper[2572]: I0416 22:13:41.413190 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-cni-binary-copy\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj" Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413196 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-run-openvswitch\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413217 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs\") pod \"network-metrics-daemon-bvqqh\" (UID: \"bb0e10b7-e200-4438-8847-608a1a6cace3\") " pod="openshift-multus/network-metrics-daemon-bvqqh" Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413246 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/749275aa-640a-4adf-ae88-0d843cc46536-host-slash\") pod \"iptables-alerter-m2782\" (UID: \"749275aa-640a-4adf-ae88-0d843cc46536\") " pod="openshift-network-operator/iptables-alerter-m2782" Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413245 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-os-release\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " 
pod="openshift-multus/multus-additional-cni-plugins-m9jnj" Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413270 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-os-release\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413299 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-multus-socket-dir-parent\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413326 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-run-multus-certs\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413332 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/749275aa-640a-4adf-ae88-0d843cc46536-host-slash\") pod \"iptables-alerter-m2782\" (UID: \"749275aa-640a-4adf-ae88-0d843cc46536\") " pod="openshift-network-operator/iptables-alerter-m2782" Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:41.413342 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413349 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57bb8201-a601-40d9-9277-43dfb8dd33cd-host\") pod \"node-ca-7969m\" (UID: \"57bb8201-a601-40d9-9277-43dfb8dd33cd\") " pod="openshift-image-registry/node-ca-7969m" Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413392 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-multus-socket-dir-parent\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.416071 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:41.413416 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs podName:bb0e10b7-e200-4438-8847-608a1a6cace3 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:41.913394187 +0000 UTC m=+2.138560562 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs") pod "network-metrics-daemon-bvqqh" (UID: "bb0e10b7-e200-4438-8847-608a1a6cace3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413427 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-host-run-multus-certs\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413431 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/57bb8201-a601-40d9-9277-43dfb8dd33cd-host\") pod \"node-ca-7969m\" (UID: \"57bb8201-a601-40d9-9277-43dfb8dd33cd\") " pod="openshift-image-registry/node-ca-7969m" Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413441 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-os-release\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/749275aa-640a-4adf-ae88-0d843cc46536-iptables-alerter-script\") pod \"iptables-alerter-m2782\" (UID: \"749275aa-640a-4adf-ae88-0d843cc46536\") " pod="openshift-network-operator/iptables-alerter-m2782" Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413499 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-etc-openvswitch\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413523 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413557 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2x2d\" (UniqueName: \"kubernetes.io/projected/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-kube-api-access-g2x2d\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp" Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413558 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413583 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-cni-binary-copy\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv" Apr 16 22:13:41.416584 
ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413591 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-slash\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413614 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-etc-openvswitch\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413615 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-socket-dir\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp"
Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413657 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-sysctl-d\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413671 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-cni-binary-copy\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj"
Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413704 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-socket-dir\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp"
Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413715 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-host-slash\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.416584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413765 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-modprobe-d\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.417077 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.413787 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-sysctl-d\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.417077 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.414034 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/749275aa-640a-4adf-ae88-0d843cc46536-iptables-alerter-script\") pod \"iptables-alerter-m2782\" (UID: \"749275aa-640a-4adf-ae88-0d843cc46536\") " pod="openshift-network-operator/iptables-alerter-m2782"
Apr 16 22:13:41.417077 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.415041 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5f94db8f-97a3-4835-b7e4-7ef02819127d-agent-certs\") pod \"konnectivity-agent-rkbz7\" (UID: \"5f94db8f-97a3-4835-b7e4-7ef02819127d\") " pod="kube-system/konnectivity-agent-rkbz7"
Apr 16 22:13:41.417077 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.415281 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/08c160f3-c9e8-42dc-beae-941858c69dfe-tmp\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.417077 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.415416 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-ovn-node-metrics-cert\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.417077 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.416337 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/08c160f3-c9e8-42dc-beae-941858c69dfe-etc-tuned\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.422342 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:41.422297 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:41.422342 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:41.422325 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:41.422342 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:41.422340 2572 projected.go:194] Error preparing data for projected volume kube-api-access-82g6r for pod openshift-network-diagnostics/network-check-target-zh7bj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:41.422550 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:41.422412 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r podName:6ea887bd-cdbc-451b-9d3d-df42e2023e31 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:41.922392637 +0000 UTC m=+2.147559012 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-82g6r" (UniqueName: "kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r") pod "network-check-target-zh7bj" (UID: "6ea887bd-cdbc-451b-9d3d-df42e2023e31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:41.422550 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:41.422478 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb67f45154e8dd1656374ab98590cc7b6.slice/crio-a306a0bc617a0e42dcc46a6cc14a6417f82cb016577c297b8ec92b6a761aaf94 WatchSource:0}: Error finding container a306a0bc617a0e42dcc46a6cc14a6417f82cb016577c297b8ec92b6a761aaf94: Status 404 returned error can't find the container with id a306a0bc617a0e42dcc46a6cc14a6417f82cb016577c297b8ec92b6a761aaf94
Apr 16 22:13:41.422857 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:41.422830 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f3ee50f97b2910d9326d3a396c451df.slice/crio-7e4f5470dec9bdb8017043e11d581feee235f49b7744c03d62c7191b43dba174 WatchSource:0}: Error finding container 7e4f5470dec9bdb8017043e11d581feee235f49b7744c03d62c7191b43dba174: Status 404 returned error can't find the container with id 7e4f5470dec9bdb8017043e11d581feee235f49b7744c03d62c7191b43dba174
Apr 16 22:13:41.425326 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.425197 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpxx2\" (UniqueName: \"kubernetes.io/projected/749275aa-640a-4adf-ae88-0d843cc46536-kube-api-access-rpxx2\") pod \"iptables-alerter-m2782\" (UID: \"749275aa-640a-4adf-ae88-0d843cc46536\") " pod="openshift-network-operator/iptables-alerter-m2782"
Apr 16 22:13:41.425326 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.425257 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx5md\" (UniqueName: \"kubernetes.io/projected/e35d7972-4269-49d3-ab9e-fcb9b7abc90d-kube-api-access-jx5md\") pod \"multus-pkxvv\" (UID: \"e35d7972-4269-49d3-ab9e-fcb9b7abc90d\") " pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.425800 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.425661 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cxg4\" (UniqueName: \"kubernetes.io/projected/5ca5fa21-17cb-4a9f-867d-d2d0533d72c3-kube-api-access-8cxg4\") pod \"multus-additional-cni-plugins-m9jnj\" (UID: \"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3\") " pod="openshift-multus/multus-additional-cni-plugins-m9jnj"
Apr 16 22:13:41.427679 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.427128 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9b7d\" (UniqueName: \"kubernetes.io/projected/bb0e10b7-e200-4438-8847-608a1a6cace3-kube-api-access-m9b7d\") pod \"network-metrics-daemon-bvqqh\" (UID: \"bb0e10b7-e200-4438-8847-608a1a6cace3\") " pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:13:41.427679 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.427496 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2x2d\" (UniqueName: \"kubernetes.io/projected/caeffd22-abcf-49f2-8e3c-fa7fa75dc166-kube-api-access-g2x2d\") pod \"aws-ebs-csi-driver-node-fjmlp\" (UID: \"caeffd22-abcf-49f2-8e3c-fa7fa75dc166\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp"
Apr 16 22:13:41.427679 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.427503 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6pm2\" (UniqueName: \"kubernetes.io/projected/a3bd5cbc-ecad-44b9-9c14-ff88792450fa-kube-api-access-d6pm2\") pod \"ovnkube-node-6bqmn\" (UID: \"a3bd5cbc-ecad-44b9-9c14-ff88792450fa\") " pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.427679 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.427504 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2tqt\" (UniqueName: \"kubernetes.io/projected/07fc2833-ed6f-40da-ac15-b1ce9aa369c6-kube-api-access-d2tqt\") pod \"node-resolver-wr7c8\" (UID: \"07fc2833-ed6f-40da-ac15-b1ce9aa369c6\") " pod="openshift-dns/node-resolver-wr7c8"
Apr 16 22:13:41.427938 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.427888 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thd79\" (UniqueName: \"kubernetes.io/projected/08c160f3-c9e8-42dc-beae-941858c69dfe-kube-api-access-thd79\") pod \"tuned-28c4j\" (UID: \"08c160f3-c9e8-42dc-beae-941858c69dfe\") " pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.428316 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.428299 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwvvd\" (UniqueName: \"kubernetes.io/projected/57bb8201-a601-40d9-9277-43dfb8dd33cd-kube-api-access-jwvvd\") pod \"node-ca-7969m\" (UID: \"57bb8201-a601-40d9-9277-43dfb8dd33cd\") " pod="openshift-image-registry/node-ca-7969m"
Apr 16 22:13:41.428584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.428557 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:13:41.429448 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.429434 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pkxvv"
Apr 16 22:13:41.436241 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.436208 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:13:41.438759 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:41.438673 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode35d7972_4269_49d3_ab9e_fcb9b7abc90d.slice/crio-d70aa48e179ab26d427c5bf2c29fbe37a6ff6e163d77092fdcdd046675e8bc10 WatchSource:0}: Error finding container d70aa48e179ab26d427c5bf2c29fbe37a6ff6e163d77092fdcdd046675e8bc10: Status 404 returned error can't find the container with id d70aa48e179ab26d427c5bf2c29fbe37a6ff6e163d77092fdcdd046675e8bc10
Apr 16 22:13:41.444394 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.444339 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pkxvv" event={"ID":"e35d7972-4269-49d3-ab9e-fcb9b7abc90d","Type":"ContainerStarted","Data":"d70aa48e179ab26d427c5bf2c29fbe37a6ff6e163d77092fdcdd046675e8bc10"}
Apr 16 22:13:41.445721 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.445689 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-16.ec2.internal" event={"ID":"1f3ee50f97b2910d9326d3a396c451df","Type":"ContainerStarted","Data":"7e4f5470dec9bdb8017043e11d581feee235f49b7744c03d62c7191b43dba174"}
Apr 16 22:13:41.446840 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.446818 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal" event={"ID":"b67f45154e8dd1656374ab98590cc7b6","Type":"ContainerStarted","Data":"a306a0bc617a0e42dcc46a6cc14a6417f82cb016577c297b8ec92b6a761aaf94"}
Apr 16 22:13:41.619229 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.619194 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-28c4j"
Apr 16 22:13:41.625138 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:41.625095 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08c160f3_c9e8_42dc_beae_941858c69dfe.slice/crio-6834300013dde519993e3cfb2590da081b88059a52372f243ceeabb51afe733e WatchSource:0}: Error finding container 6834300013dde519993e3cfb2590da081b88059a52372f243ceeabb51afe733e: Status 404 returned error can't find the container with id 6834300013dde519993e3cfb2590da081b88059a52372f243ceeabb51afe733e
Apr 16 22:13:41.640835 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.640800 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wr7c8"
Apr 16 22:13:41.648781 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:41.648671 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07fc2833_ed6f_40da_ac15_b1ce9aa369c6.slice/crio-a29309b0ae3bd5710bff21ef67284edd3647e028faff8c2fb51fa793b6026dad WatchSource:0}: Error finding container a29309b0ae3bd5710bff21ef67284edd3647e028faff8c2fb51fa793b6026dad: Status 404 returned error can't find the container with id a29309b0ae3bd5710bff21ef67284edd3647e028faff8c2fb51fa793b6026dad
Apr 16 22:13:41.654474 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.654447 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7969m"
Apr 16 22:13:41.660923 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:41.660883 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57bb8201_a601_40d9_9277_43dfb8dd33cd.slice/crio-bebdb6839431abe00f6c1c4b3d036f1cc18889df62db5675d28608300f89fbd4 WatchSource:0}: Error finding container bebdb6839431abe00f6c1c4b3d036f1cc18889df62db5675d28608300f89fbd4: Status 404 returned error can't find the container with id bebdb6839431abe00f6c1c4b3d036f1cc18889df62db5675d28608300f89fbd4
Apr 16 22:13:41.666001 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.665977 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m9jnj"
Apr 16 22:13:41.672005 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.671979 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m2782"
Apr 16 22:13:41.672242 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:41.672218 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ca5fa21_17cb_4a9f_867d_d2d0533d72c3.slice/crio-42633e4aec9669defbaeffc2e01c7f141a571c6a204de0323a7033201a52b191 WatchSource:0}: Error finding container 42633e4aec9669defbaeffc2e01c7f141a571c6a204de0323a7033201a52b191: Status 404 returned error can't find the container with id 42633e4aec9669defbaeffc2e01c7f141a571c6a204de0323a7033201a52b191
Apr 16 22:13:41.678311 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:41.678284 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod749275aa_640a_4adf_ae88_0d843cc46536.slice/crio-20e834d522bbecded16149bed63558dd0ef24f2b22968ea4a447f72f63a98065 WatchSource:0}: Error finding container 20e834d522bbecded16149bed63558dd0ef24f2b22968ea4a447f72f63a98065: Status 404 returned error can't find the container with id 20e834d522bbecded16149bed63558dd0ef24f2b22968ea4a447f72f63a98065
Apr 16 22:13:41.702806 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.702705 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rkbz7"
Apr 16 22:13:41.710306 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:41.710269 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f94db8f_97a3_4835_b7e4_7ef02819127d.slice/crio-ce593dcb9e85476d814749303171ef823ecf99d207c20d296b2ae2ffd580409d WatchSource:0}: Error finding container ce593dcb9e85476d814749303171ef823ecf99d207c20d296b2ae2ffd580409d: Status 404 returned error can't find the container with id ce593dcb9e85476d814749303171ef823ecf99d207c20d296b2ae2ffd580409d
Apr 16 22:13:41.724163 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.724130 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp"
Apr 16 22:13:41.730627 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:13:41.730597 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaeffd22_abcf_49f2_8e3c_fa7fa75dc166.slice/crio-27860a4e48032086d27e31b261420306683ccf620a0fded5cc06590fb9d41968 WatchSource:0}: Error finding container 27860a4e48032086d27e31b261420306683ccf620a0fded5cc06590fb9d41968: Status 404 returned error can't find the container with id 27860a4e48032086d27e31b261420306683ccf620a0fded5cc06590fb9d41968
Apr 16 22:13:41.918727 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:41.918129 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs\") pod \"network-metrics-daemon-bvqqh\" (UID: \"bb0e10b7-e200-4438-8847-608a1a6cace3\") " pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:13:41.918727 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:41.918316 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:41.918727 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:41.918381 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs podName:bb0e10b7-e200-4438-8847-608a1a6cace3 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:42.918362741 +0000 UTC m=+3.143529102 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs") pod "network-metrics-daemon-bvqqh" (UID: "bb0e10b7-e200-4438-8847-608a1a6cace3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:42.020205 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.019385 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82g6r\" (UniqueName: \"kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r\") pod \"network-check-target-zh7bj\" (UID: \"6ea887bd-cdbc-451b-9d3d-df42e2023e31\") " pod="openshift-network-diagnostics/network-check-target-zh7bj"
Apr 16 22:13:42.020205 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:42.019600 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:42.020205 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:42.019618 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:42.020205 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:42.019631 2572 projected.go:194] Error preparing data for projected volume kube-api-access-82g6r for pod openshift-network-diagnostics/network-check-target-zh7bj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:42.020205 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:42.019689 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r podName:6ea887bd-cdbc-451b-9d3d-df42e2023e31 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:43.019671036 +0000 UTC m=+3.244837400 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-82g6r" (UniqueName: "kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r") pod "network-check-target-zh7bj" (UID: "6ea887bd-cdbc-451b-9d3d-df42e2023e31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:42.315722 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.315483 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:42.343885 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.343796 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:41 +0000 UTC" deadline="2027-10-31 23:58:59.235746267 +0000 UTC"
Apr 16 22:13:42.343885 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.343828 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13513h45m16.891922876s"
Apr 16 22:13:42.449479 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.448553 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:13:42.449479 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.448645 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:42.449479 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:42.448727 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3"
Apr 16 22:13:42.470717 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.470640 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp" event={"ID":"caeffd22-abcf-49f2-8e3c-fa7fa75dc166","Type":"ContainerStarted","Data":"27860a4e48032086d27e31b261420306683ccf620a0fded5cc06590fb9d41968"}
Apr 16 22:13:42.484699 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.484658 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rkbz7" event={"ID":"5f94db8f-97a3-4835-b7e4-7ef02819127d","Type":"ContainerStarted","Data":"ce593dcb9e85476d814749303171ef823ecf99d207c20d296b2ae2ffd580409d"}
Apr 16 22:13:42.501981 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.501939 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m2782" event={"ID":"749275aa-640a-4adf-ae88-0d843cc46536","Type":"ContainerStarted","Data":"20e834d522bbecded16149bed63558dd0ef24f2b22968ea4a447f72f63a98065"}
Apr 16 22:13:42.512055 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.511976 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7969m" event={"ID":"57bb8201-a601-40d9-9277-43dfb8dd33cd","Type":"ContainerStarted","Data":"bebdb6839431abe00f6c1c4b3d036f1cc18889df62db5675d28608300f89fbd4"}
Apr 16 22:13:42.529021 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.528977 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-28c4j" event={"ID":"08c160f3-c9e8-42dc-beae-941858c69dfe","Type":"ContainerStarted","Data":"6834300013dde519993e3cfb2590da081b88059a52372f243ceeabb51afe733e"}
Apr 16 22:13:42.533785 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.533743 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9jnj" event={"ID":"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3","Type":"ContainerStarted","Data":"42633e4aec9669defbaeffc2e01c7f141a571c6a204de0323a7033201a52b191"}
Apr 16 22:13:42.536091 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.536061 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wr7c8" event={"ID":"07fc2833-ed6f-40da-ac15-b1ce9aa369c6","Type":"ContainerStarted","Data":"a29309b0ae3bd5710bff21ef67284edd3647e028faff8c2fb51fa793b6026dad"}
Apr 16 22:13:42.558313 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.558269 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" event={"ID":"a3bd5cbc-ecad-44b9-9c14-ff88792450fa","Type":"ContainerStarted","Data":"be8db424d5941bb5eaf1d8eca0319b7b2a878520316113f949d2a4df1f662480"}
Apr 16 22:13:42.732585 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.732489 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:42.925793 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:42.925750 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs\") pod \"network-metrics-daemon-bvqqh\" (UID: \"bb0e10b7-e200-4438-8847-608a1a6cace3\") " pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:13:42.926062 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:42.925909 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:42.926062 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:42.926000 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs podName:bb0e10b7-e200-4438-8847-608a1a6cace3 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:44.925958892 +0000 UTC m=+5.151125254 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs") pod "network-metrics-daemon-bvqqh" (UID: "bb0e10b7-e200-4438-8847-608a1a6cace3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:43.027370 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:43.026602 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82g6r\" (UniqueName: \"kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r\") pod \"network-check-target-zh7bj\" (UID: \"6ea887bd-cdbc-451b-9d3d-df42e2023e31\") " pod="openshift-network-diagnostics/network-check-target-zh7bj"
Apr 16 22:13:43.027370 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:43.026800 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:43.027370 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:43.026820 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:43.027370 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:43.026832 2572 projected.go:194] Error preparing data for projected volume kube-api-access-82g6r for pod openshift-network-diagnostics/network-check-target-zh7bj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:43.027370 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:43.026894 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r podName:6ea887bd-cdbc-451b-9d3d-df42e2023e31 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:45.026874563 +0000 UTC m=+5.252040924 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-82g6r" (UniqueName: "kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r") pod "network-check-target-zh7bj" (UID: "6ea887bd-cdbc-451b-9d3d-df42e2023e31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:43.345062 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:43.345016 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:41 +0000 UTC" deadline="2027-09-17 11:20:55.009741397 +0000 UTC"
Apr 16 22:13:43.345062 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:43.345059 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12445h7m11.664686482s"
Apr 16 22:13:43.439641 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:43.439604 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj"
Apr 16 22:13:43.439819 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:43.439737 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31"
Apr 16 22:13:44.444134 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:44.443029 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:13:44.444134 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:44.443181 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3"
Apr 16 22:13:44.947838 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:44.947760 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs\") pod \"network-metrics-daemon-bvqqh\" (UID: \"bb0e10b7-e200-4438-8847-608a1a6cace3\") " pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:13:44.948040 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:44.947914 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:44.948040 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:44.947979 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs podName:bb0e10b7-e200-4438-8847-608a1a6cace3 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:48.947961026 +0000 UTC m=+9.173127388 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs") pod "network-metrics-daemon-bvqqh" (UID: "bb0e10b7-e200-4438-8847-608a1a6cace3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:45.048493 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:45.048454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82g6r\" (UniqueName: \"kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r\") pod \"network-check-target-zh7bj\" (UID: \"6ea887bd-cdbc-451b-9d3d-df42e2023e31\") " pod="openshift-network-diagnostics/network-check-target-zh7bj"
Apr 16 22:13:45.048682 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:45.048637 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:45.048682 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:45.048673 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:45.048800 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:45.048687 2572 projected.go:194] Error preparing data for projected volume kube-api-access-82g6r for pod openshift-network-diagnostics/network-check-target-zh7bj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:45.048800 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:45.048752 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r podName:6ea887bd-cdbc-451b-9d3d-df42e2023e31 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:49.048732643 +0000 UTC m=+9.273899007 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-82g6r" (UniqueName: "kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r") pod "network-check-target-zh7bj" (UID: "6ea887bd-cdbc-451b-9d3d-df42e2023e31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:45.439888 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:45.439849 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj"
Apr 16 22:13:45.440083 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:45.440001 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31"
Apr 16 22:13:46.440198 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:46.440159 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:13:46.440805 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:46.440297 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3"
Apr 16 22:13:47.439456 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:47.439415 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj"
Apr 16 22:13:47.439639 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:47.439563 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31"
Apr 16 22:13:48.440237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:48.440027 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:13:48.440237 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:48.440191 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3" Apr 16 22:13:48.985177 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:48.984541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs\") pod \"network-metrics-daemon-bvqqh\" (UID: \"bb0e10b7-e200-4438-8847-608a1a6cace3\") " pod="openshift-multus/network-metrics-daemon-bvqqh" Apr 16 22:13:48.985177 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:48.984719 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:48.985177 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:48.984787 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs podName:bb0e10b7-e200-4438-8847-608a1a6cace3 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:56.984766533 +0000 UTC m=+17.209932896 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs") pod "network-metrics-daemon-bvqqh" (UID: "bb0e10b7-e200-4438-8847-608a1a6cace3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:49.085757 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:49.085717 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82g6r\" (UniqueName: \"kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r\") pod \"network-check-target-zh7bj\" (UID: \"6ea887bd-cdbc-451b-9d3d-df42e2023e31\") " pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:13:49.085932 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:49.085881 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:49.085932 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:49.085898 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:49.085932 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:49.085910 2572 projected.go:194] Error preparing data for projected volume kube-api-access-82g6r for pod openshift-network-diagnostics/network-check-target-zh7bj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:49.086099 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:49.085975 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r podName:6ea887bd-cdbc-451b-9d3d-df42e2023e31 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:13:57.085953537 +0000 UTC m=+17.311119895 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-82g6r" (UniqueName: "kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r") pod "network-check-target-zh7bj" (UID: "6ea887bd-cdbc-451b-9d3d-df42e2023e31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:49.440227 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:49.440178 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:13:49.440428 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:49.440288 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31" Apr 16 22:13:50.441034 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:50.440971 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh" Apr 16 22:13:50.441515 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:50.441118 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3" Apr 16 22:13:51.440384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:51.440344 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:13:51.440581 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:51.440470 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31" Apr 16 22:13:52.439565 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:52.439530 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh" Apr 16 22:13:52.439980 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:52.439685 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3" Apr 16 22:13:53.439828 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:53.439784 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:13:53.440328 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:53.439925 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31" Apr 16 22:13:54.439495 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:54.439452 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh" Apr 16 22:13:54.439669 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:54.439585 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3" Apr 16 22:13:55.439538 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:55.439500 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:13:55.439951 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:55.439625 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31" Apr 16 22:13:56.440338 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:56.440299 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh" Apr 16 22:13:56.440797 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:56.440438 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3" Apr 16 22:13:57.043725 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:57.043682 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs\") pod \"network-metrics-daemon-bvqqh\" (UID: \"bb0e10b7-e200-4438-8847-608a1a6cace3\") " pod="openshift-multus/network-metrics-daemon-bvqqh" Apr 16 22:13:57.043912 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:57.043855 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:57.043969 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:57.043938 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs podName:bb0e10b7-e200-4438-8847-608a1a6cace3 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:13.043917268 +0000 UTC m=+33.269083626 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs") pod "network-metrics-daemon-bvqqh" (UID: "bb0e10b7-e200-4438-8847-608a1a6cace3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:57.144308 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:57.144258 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82g6r\" (UniqueName: \"kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r\") pod \"network-check-target-zh7bj\" (UID: \"6ea887bd-cdbc-451b-9d3d-df42e2023e31\") " pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:13:57.144508 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:57.144456 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:57.144508 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:57.144479 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:57.144508 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:57.144492 2572 projected.go:194] Error preparing data for projected volume kube-api-access-82g6r for pod openshift-network-diagnostics/network-check-target-zh7bj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:57.144680 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:57.144557 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r podName:6ea887bd-cdbc-451b-9d3d-df42e2023e31 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:14:13.144537071 +0000 UTC m=+33.369703449 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-82g6r" (UniqueName: "kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r") pod "network-check-target-zh7bj" (UID: "6ea887bd-cdbc-451b-9d3d-df42e2023e31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:57.440164 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:57.440062 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:13:57.440339 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:57.440211 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31" Apr 16 22:13:58.439514 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:58.439476 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh" Apr 16 22:13:58.439980 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:58.439616 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3" Apr 16 22:13:59.439688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:13:59.439649 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:13:59.440028 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:13:59.439758 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31" Apr 16 22:14:00.443514 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:00.443184 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh" Apr 16 22:14:00.444183 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:00.443643 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3" Apr 16 22:14:00.598055 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:00.597238 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pkxvv" event={"ID":"e35d7972-4269-49d3-ab9e-fcb9b7abc90d","Type":"ContainerStarted","Data":"5539d4da2986c8e489a541023b40552e17450205756c5f3803277e7c9c71847b"} Apr 16 22:14:00.599482 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:00.599422 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-16.ec2.internal" event={"ID":"1f3ee50f97b2910d9326d3a396c451df","Type":"ContainerStarted","Data":"f5d9f603d2c931c8df7f784193ad70110f582d0d73d469a3675654f3f5e09979"} Apr 16 22:14:00.615363 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:00.615335 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/0.log" Apr 16 22:14:00.615772 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:00.615741 2572 generic.go:358] "Generic (PLEG): container finished" podID="a3bd5cbc-ecad-44b9-9c14-ff88792450fa" containerID="ba9f629b0339ca9bd7bf08aacefa2b43eb5be8a88bfccd80f22aaea7a0b077b6" exitCode=1 Apr 16 22:14:00.615903 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:00.615833 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" event={"ID":"a3bd5cbc-ecad-44b9-9c14-ff88792450fa","Type":"ContainerStarted","Data":"10564de3c6077f7b9336815fe10bed353e353c7ae15f9ba9c7a81499f8a04948"} Apr 16 22:14:00.615903 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:00.615873 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" event={"ID":"a3bd5cbc-ecad-44b9-9c14-ff88792450fa","Type":"ContainerStarted","Data":"2b2f3fbd62b5b49535faaa9bb155f2b6705ca4cbb3b85711a007450de4c044ad"} Apr 16 22:14:00.615903 
ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:00.615886 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" event={"ID":"a3bd5cbc-ecad-44b9-9c14-ff88792450fa","Type":"ContainerStarted","Data":"844cc14e76ee8e45e06a62a854fcea5f775e464e548c0a39434a92a7bdf24e72"} Apr 16 22:14:00.615903 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:00.615897 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" event={"ID":"a3bd5cbc-ecad-44b9-9c14-ff88792450fa","Type":"ContainerStarted","Data":"bbd4e5c09501d052c6f1b8741b807fbb56377fe4ff2d9a52729a9751cb2b6b8a"} Apr 16 22:14:00.616043 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:00.615910 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" event={"ID":"a3bd5cbc-ecad-44b9-9c14-ff88792450fa","Type":"ContainerDied","Data":"ba9f629b0339ca9bd7bf08aacefa2b43eb5be8a88bfccd80f22aaea7a0b077b6"} Apr 16 22:14:00.616043 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:00.615926 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" event={"ID":"a3bd5cbc-ecad-44b9-9c14-ff88792450fa","Type":"ContainerStarted","Data":"4acd1dae862951d713e3616472ba57ead94b8273f62549c7f1f6bff59e4c1b55"} Apr 16 22:14:00.618591 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:00.618352 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pkxvv" podStartSLOduration=2.257889068 podStartE2EDuration="20.618340171s" podCreationTimestamp="2026-04-16 22:13:40 +0000 UTC" firstStartedPulling="2026-04-16 22:13:41.443511285 +0000 UTC m=+1.668677642" lastFinishedPulling="2026-04-16 22:13:59.803962387 +0000 UTC m=+20.029128745" observedRunningTime="2026-04-16 22:14:00.618062181 +0000 UTC m=+20.843228572" watchObservedRunningTime="2026-04-16 22:14:00.618340171 +0000 UTC m=+20.843506552" Apr 16 22:14:00.623722 ip-10-0-133-16 
kubenswrapper[2572]: I0416 22:14:00.623592 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-28c4j" event={"ID":"08c160f3-c9e8-42dc-beae-941858c69dfe","Type":"ContainerStarted","Data":"41861a9f56e96a66be5bd8f9a8fd8fe2d198291f01de256714f295b124f162c1"} Apr 16 22:14:00.632336 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:00.632272 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-16.ec2.internal" podStartSLOduration=20.632254595 podStartE2EDuration="20.632254595s" podCreationTimestamp="2026-04-16 22:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:14:00.632192164 +0000 UTC m=+20.857358544" watchObservedRunningTime="2026-04-16 22:14:00.632254595 +0000 UTC m=+20.857420978" Apr 16 22:14:00.650490 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:00.650435 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-28c4j" podStartSLOduration=2.486134464 podStartE2EDuration="20.650416075s" podCreationTimestamp="2026-04-16 22:13:40 +0000 UTC" firstStartedPulling="2026-04-16 22:13:41.626626111 +0000 UTC m=+1.851792472" lastFinishedPulling="2026-04-16 22:13:59.790907711 +0000 UTC m=+20.016074083" observedRunningTime="2026-04-16 22:14:00.649829447 +0000 UTC m=+20.874995827" watchObservedRunningTime="2026-04-16 22:14:00.650416075 +0000 UTC m=+20.875582480" Apr 16 22:14:01.440444 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.440402 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:14:01.440635 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:01.440543 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31" Apr 16 22:14:01.626762 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.626532 2572 generic.go:358] "Generic (PLEG): container finished" podID="5ca5fa21-17cb-4a9f-867d-d2d0533d72c3" containerID="b9de959212239d653f1d084ed88c06700c50c4ce229914c13a7e6992ed39f85f" exitCode=0 Apr 16 22:14:01.627256 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.626579 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9jnj" event={"ID":"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3","Type":"ContainerDied","Data":"b9de959212239d653f1d084ed88c06700c50c4ce229914c13a7e6992ed39f85f"} Apr 16 22:14:01.628688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.628645 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wr7c8" event={"ID":"07fc2833-ed6f-40da-ac15-b1ce9aa369c6","Type":"ContainerStarted","Data":"71d5e44e9c229bd8d980ef943206f205d8460739a73e8388d36572b8c48f3314"} Apr 16 22:14:01.630477 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.630455 2572 generic.go:358] "Generic (PLEG): container finished" podID="b67f45154e8dd1656374ab98590cc7b6" containerID="18c8189d69d2f4dfa01bd9d8f7ac0c890657a12af7a991a959697fd68d63f410" exitCode=0 Apr 16 22:14:01.630566 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.630488 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal" event={"ID":"b67f45154e8dd1656374ab98590cc7b6","Type":"ContainerDied","Data":"18c8189d69d2f4dfa01bd9d8f7ac0c890657a12af7a991a959697fd68d63f410"} Apr 16 22:14:01.632549 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.632512 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp" event={"ID":"caeffd22-abcf-49f2-8e3c-fa7fa75dc166","Type":"ContainerStarted","Data":"ce909140fa40115e4601b0dc57d3a849140428e781324a33327c94e972ea6c7b"} Apr 16 22:14:01.632549 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.632549 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp" event={"ID":"caeffd22-abcf-49f2-8e3c-fa7fa75dc166","Type":"ContainerStarted","Data":"8b5c97e4fb7feea62e7aee3a78f1093457dda2136d9c8e4b0f968759f39162a1"} Apr 16 22:14:01.633095 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.633077 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 22:14:01.634595 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.634574 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rkbz7" event={"ID":"5f94db8f-97a3-4835-b7e4-7ef02819127d","Type":"ContainerStarted","Data":"0810ca8cff399c6a883a5762d836aff67a8d77ca73d7bddeec9869122753a170"} Apr 16 22:14:01.635782 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.635762 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m2782" event={"ID":"749275aa-640a-4adf-ae88-0d843cc46536","Type":"ContainerStarted","Data":"292e68721a035ad1bd8450d4f7a3d057afb8ea2913a1e1841c3aa63ac4ac32df"} Apr 16 22:14:01.637099 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.637051 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-7969m" event={"ID":"57bb8201-a601-40d9-9277-43dfb8dd33cd","Type":"ContainerStarted","Data":"aede542484e13456c69a235551648cae30a6bd9bbb96c7649925982e1a8ea04e"} Apr 16 22:14:01.677145 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.677048 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wr7c8" podStartSLOduration=3.537662802 podStartE2EDuration="21.67703132s" podCreationTimestamp="2026-04-16 22:13:40 +0000 UTC" firstStartedPulling="2026-04-16 22:13:41.650016554 +0000 UTC m=+1.875182912" lastFinishedPulling="2026-04-16 22:13:59.789385057 +0000 UTC m=+20.014551430" observedRunningTime="2026-04-16 22:14:01.67653305 +0000 UTC m=+21.901699431" watchObservedRunningTime="2026-04-16 22:14:01.67703132 +0000 UTC m=+21.902197701" Apr 16 22:14:01.691961 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.691900 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-m2782" podStartSLOduration=3.582269084 podStartE2EDuration="21.691880511s" podCreationTimestamp="2026-04-16 22:13:40 +0000 UTC" firstStartedPulling="2026-04-16 22:13:41.679777226 +0000 UTC m=+1.904943583" lastFinishedPulling="2026-04-16 22:13:59.789388652 +0000 UTC m=+20.014555010" observedRunningTime="2026-04-16 22:14:01.691511826 +0000 UTC m=+21.916678209" watchObservedRunningTime="2026-04-16 22:14:01.691880511 +0000 UTC m=+21.917046893" Apr 16 22:14:01.706250 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.706201 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rkbz7" podStartSLOduration=3.628156556 podStartE2EDuration="21.706186109s" podCreationTimestamp="2026-04-16 22:13:40 +0000 UTC" firstStartedPulling="2026-04-16 22:13:41.711790959 +0000 UTC m=+1.936957317" lastFinishedPulling="2026-04-16 22:13:59.789820499 +0000 UTC m=+20.014986870" observedRunningTime="2026-04-16 22:14:01.706141423 
+0000 UTC m=+21.931307803" watchObservedRunningTime="2026-04-16 22:14:01.706186109 +0000 UTC m=+21.931352489"
Apr 16 22:14:01.724206 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.724152 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7969m" podStartSLOduration=3.597257597 podStartE2EDuration="21.724133084s" podCreationTimestamp="2026-04-16 22:13:40 +0000 UTC" firstStartedPulling="2026-04-16 22:13:41.66261107 +0000 UTC m=+1.887777432" lastFinishedPulling="2026-04-16 22:13:59.78948656 +0000 UTC m=+20.014652919" observedRunningTime="2026-04-16 22:14:01.723374051 +0000 UTC m=+21.948540430" watchObservedRunningTime="2026-04-16 22:14:01.724133084 +0000 UTC m=+21.949299465"
Apr 16 22:14:01.744123 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.744082 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rkbz7"
Apr 16 22:14:01.744762 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:01.744735 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rkbz7"
Apr 16 22:14:02.377444 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:02.377323 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T22:14:01.633091934Z","UUID":"6325a4e4-094b-4ac8-a719-4ce071b927a2","Handler":null,"Name":"","Endpoint":""}
Apr 16 22:14:02.379269 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:02.379246 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 22:14:02.379402 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:02.379277 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 22:14:02.445740 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:02.445704 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:14:02.445972 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:02.445911 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3"
Apr 16 22:14:02.642011 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:02.641947 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/0.log"
Apr 16 22:14:02.642763 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:02.642306 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" event={"ID":"a3bd5cbc-ecad-44b9-9c14-ff88792450fa","Type":"ContainerStarted","Data":"d85976ac64e9cba104614e2c93a23e87c9dc258a6113122863a87016a4c722a3"}
Apr 16 22:14:02.644583 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:02.644549 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal" event={"ID":"b67f45154e8dd1656374ab98590cc7b6","Type":"ContainerStarted","Data":"86e7b487b33aa534635fcc18641a983af4c6ecaef706f68d9a8ec4ee8fcd0c6b"}
Apr 16 22:14:02.646867 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:02.646832 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp" event={"ID":"caeffd22-abcf-49f2-8e3c-fa7fa75dc166","Type":"ContainerStarted","Data":"2a38f9751333af80e77fa5fcc491564d94a9133dce2bf512363940ba10557b0a"}
Apr 16 22:14:02.647756 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:02.647733 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rkbz7"
Apr 16 22:14:02.647863 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:02.647788 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rkbz7"
Apr 16 22:14:02.663378 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:02.663322 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-16.ec2.internal" podStartSLOduration=22.663307 podStartE2EDuration="22.663307s" podCreationTimestamp="2026-04-16 22:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:14:02.663170923 +0000 UTC m=+22.888337304" watchObservedRunningTime="2026-04-16 22:14:02.663307 +0000 UTC m=+22.888473376"
Apr 16 22:14:03.439584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:03.439551 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj"
Apr 16 22:14:03.439820 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:03.439671 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31"
Apr 16 22:14:04.439596 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:04.439567 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:14:04.440256 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:04.439713 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3"
Apr 16 22:14:05.439948 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:05.439912 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj"
Apr 16 22:14:05.440432 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:05.440021 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31"
Apr 16 22:14:05.657046 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:05.656868 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/0.log"
Apr 16 22:14:05.657452 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:05.657422 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" event={"ID":"a3bd5cbc-ecad-44b9-9c14-ff88792450fa","Type":"ContainerStarted","Data":"ea599ef900599d435c004ec78c113490d92e63a68a5004815cca2baab3b4a947"}
Apr 16 22:14:05.659322 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:05.658703 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:14:05.659322 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:05.658998 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:14:05.659322 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:05.659188 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:14:05.661743 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:05.661723 2572 scope.go:117] "RemoveContainer" containerID="ba9f629b0339ca9bd7bf08aacefa2b43eb5be8a88bfccd80f22aaea7a0b077b6"
Apr 16 22:14:05.677766 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:05.677740 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:14:05.677979 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:05.677958 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn"
Apr 16 22:14:05.688293 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:05.688251 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fjmlp" podStartSLOduration=4.996493757 podStartE2EDuration="25.688236497s" podCreationTimestamp="2026-04-16 22:13:40 +0000 UTC" firstStartedPulling="2026-04-16 22:13:41.732211269 +0000 UTC m=+1.957377627" lastFinishedPulling="2026-04-16 22:14:02.423953993 +0000 UTC m=+22.649120367" observedRunningTime="2026-04-16 22:14:02.695360167 +0000 UTC m=+22.920526548" watchObservedRunningTime="2026-04-16 22:14:05.688236497 +0000 UTC m=+25.913402877"
Apr 16 22:14:06.439369 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:06.439330 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:14:06.439585 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:06.439461 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3"
Apr 16 22:14:06.661122 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:06.661068 2572 generic.go:358] "Generic (PLEG): container finished" podID="5ca5fa21-17cb-4a9f-867d-d2d0533d72c3" containerID="158b242be78c85a908bcabac4d0d608c2a7cc81759438011c03411f493b24d51" exitCode=0
Apr 16 22:14:06.661614 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:06.661129 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9jnj" event={"ID":"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3","Type":"ContainerDied","Data":"158b242be78c85a908bcabac4d0d608c2a7cc81759438011c03411f493b24d51"}
Apr 16 22:14:06.664474 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:06.664450 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/0.log"
Apr 16 22:14:06.664778 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:06.664754 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" event={"ID":"a3bd5cbc-ecad-44b9-9c14-ff88792450fa","Type":"ContainerStarted","Data":"2f5e43006b368ad1022556746e98e73bf299c579cb591f1930fbfe14b2a9b69c"}
Apr 16 22:14:06.736607 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:06.736494 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" podStartSLOduration=8.170962626 podStartE2EDuration="26.736477877s" podCreationTimestamp="2026-04-16 22:13:40 +0000 UTC" firstStartedPulling="2026-04-16 22:13:41.445244386 +0000 UTC m=+1.670410756" lastFinishedPulling="2026-04-16 22:14:00.010759646 +0000 UTC m=+20.235926007" observedRunningTime="2026-04-16 22:14:06.735901863 +0000 UTC m=+26.961068242" watchObservedRunningTime="2026-04-16 22:14:06.736477877 +0000 UTC m=+26.961644267"
Apr 16 22:14:07.440289 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:07.440076 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj"
Apr 16 22:14:07.440490 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:07.440358 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31"
Apr 16 22:14:07.490883 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:07.490814 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bvqqh"]
Apr 16 22:14:07.491013 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:07.490935 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:14:07.491063 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:07.491022 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3"
Apr 16 22:14:07.494001 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:07.493975 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zh7bj"]
Apr 16 22:14:07.668415 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:07.668384 2572 generic.go:358] "Generic (PLEG): container finished" podID="5ca5fa21-17cb-4a9f-867d-d2d0533d72c3" containerID="d8dcd1f77c2c241ceb5fa34321c785f835cf2170fe2ba6d0845ae87dce2da386" exitCode=0
Apr 16 22:14:07.668903 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:07.668468 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj"
Apr 16 22:14:07.668903 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:07.668471 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9jnj" event={"ID":"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3","Type":"ContainerDied","Data":"d8dcd1f77c2c241ceb5fa34321c785f835cf2170fe2ba6d0845ae87dce2da386"}
Apr 16 22:14:07.668903 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:07.668734 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31"
Apr 16 22:14:08.673618 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:08.673577 2572 generic.go:358] "Generic (PLEG): container finished" podID="5ca5fa21-17cb-4a9f-867d-d2d0533d72c3" containerID="123c9ba99bc539665449a59856d4e4e417f282cab75df2dd2f73645df4bc3982" exitCode=0
Apr 16 22:14:08.674347 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:08.673646 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9jnj" event={"ID":"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3","Type":"ContainerDied","Data":"123c9ba99bc539665449a59856d4e4e417f282cab75df2dd2f73645df4bc3982"}
Apr 16 22:14:09.440166 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:09.440076 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj"
Apr 16 22:14:09.440319 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:09.440244 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31"
Apr 16 22:14:09.440319 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:09.440305 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:14:09.440465 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:09.440444 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3"
Apr 16 22:14:11.439516 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:11.439477 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj"
Apr 16 22:14:11.440024 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:11.439486 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:14:11.440024 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:11.439615 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zh7bj" podUID="6ea887bd-cdbc-451b-9d3d-df42e2023e31"
Apr 16 22:14:11.440024 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:11.439714 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bvqqh" podUID="bb0e10b7-e200-4438-8847-608a1a6cace3"
Apr 16 22:14:12.064296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.064081 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-16.ec2.internal" event="NodeReady"
Apr 16 22:14:12.064474 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.064425 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 22:14:12.107499 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.107461 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mf4m2"]
Apr 16 22:14:12.132778 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.132732 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tbnzs"]
Apr 16 22:14:12.133386 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.133359 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:14:12.136183 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.136161 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 22:14:12.136183 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.136173 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 22:14:12.136341 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.136196 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nlsdl\""
Apr 16 22:14:12.150796 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.150748 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mf4m2"]
Apr 16 22:14:12.150796 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.150790 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tbnzs"]
Apr 16 22:14:12.151056 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.150920 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tbnzs"
Apr 16 22:14:12.153561 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.153539 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-skfrk\""
Apr 16 22:14:12.153705 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.153689 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 22:14:12.153974 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.153950 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 22:14:12.154290 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.154271 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 22:14:12.254602 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.254559 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjg94\" (UniqueName: \"kubernetes.io/projected/2357d8ff-11e8-4af7-a6de-7223d6294ade-kube-api-access-mjg94\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:14:12.254602 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.254608 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbwzc\" (UniqueName: \"kubernetes.io/projected/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-kube-api-access-sbwzc\") pod \"ingress-canary-tbnzs\" (UID: \"fd83defb-549f-4b35-96b4-9fe5c34c7ebf\") " pod="openshift-ingress-canary/ingress-canary-tbnzs"
Apr 16 22:14:12.254866 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.254724 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2357d8ff-11e8-4af7-a6de-7223d6294ade-tmp-dir\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:14:12.254866 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.254780 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2357d8ff-11e8-4af7-a6de-7223d6294ade-config-volume\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:14:12.254866 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.254819 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:14:12.255000 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.254884 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert\") pod \"ingress-canary-tbnzs\" (UID: \"fd83defb-549f-4b35-96b4-9fe5c34c7ebf\") " pod="openshift-ingress-canary/ingress-canary-tbnzs"
Apr 16 22:14:12.358615 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.356471 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2357d8ff-11e8-4af7-a6de-7223d6294ade-config-volume\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:14:12.358615 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.356543 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:14:12.358615 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.356606 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert\") pod \"ingress-canary-tbnzs\" (UID: \"fd83defb-549f-4b35-96b4-9fe5c34c7ebf\") " pod="openshift-ingress-canary/ingress-canary-tbnzs"
Apr 16 22:14:12.358615 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.356646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjg94\" (UniqueName: \"kubernetes.io/projected/2357d8ff-11e8-4af7-a6de-7223d6294ade-kube-api-access-mjg94\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:14:12.358615 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.356683 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbwzc\" (UniqueName: \"kubernetes.io/projected/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-kube-api-access-sbwzc\") pod \"ingress-canary-tbnzs\" (UID: \"fd83defb-549f-4b35-96b4-9fe5c34c7ebf\") " pod="openshift-ingress-canary/ingress-canary-tbnzs"
Apr 16 22:14:12.358615 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.356760 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2357d8ff-11e8-4af7-a6de-7223d6294ade-tmp-dir\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:14:12.358615 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.357163 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2357d8ff-11e8-4af7-a6de-7223d6294ade-tmp-dir\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:14:12.358615 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.357703 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2357d8ff-11e8-4af7-a6de-7223d6294ade-config-volume\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:14:12.358615 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:12.357828 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:12.358615 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:12.357898 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls podName:2357d8ff-11e8-4af7-a6de-7223d6294ade nodeName:}" failed. No retries permitted until 2026-04-16 22:14:12.85787728 +0000 UTC m=+33.083043652 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls") pod "dns-default-mf4m2" (UID: "2357d8ff-11e8-4af7-a6de-7223d6294ade") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:12.358615 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:12.358477 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:12.358615 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:12.358552 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert podName:fd83defb-549f-4b35-96b4-9fe5c34c7ebf nodeName:}" failed. No retries permitted until 2026-04-16 22:14:12.858523269 +0000 UTC m=+33.083689644 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert") pod "ingress-canary-tbnzs" (UID: "fd83defb-549f-4b35-96b4-9fe5c34c7ebf") : secret "canary-serving-cert" not found
Apr 16 22:14:12.370706 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.370672 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjg94\" (UniqueName: \"kubernetes.io/projected/2357d8ff-11e8-4af7-a6de-7223d6294ade-kube-api-access-mjg94\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:14:12.370890 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.370761 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbwzc\" (UniqueName: \"kubernetes.io/projected/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-kube-api-access-sbwzc\") pod \"ingress-canary-tbnzs\" (UID: \"fd83defb-549f-4b35-96b4-9fe5c34c7ebf\") " pod="openshift-ingress-canary/ingress-canary-tbnzs"
Apr 16 22:14:12.859629 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.859587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:14:12.860124 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:12.859651 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert\") pod \"ingress-canary-tbnzs\" (UID: \"fd83defb-549f-4b35-96b4-9fe5c34c7ebf\") " pod="openshift-ingress-canary/ingress-canary-tbnzs"
Apr 16 22:14:12.860124 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:12.859740 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:12.860124 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:12.859806 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert podName:fd83defb-549f-4b35-96b4-9fe5c34c7ebf nodeName:}" failed. No retries permitted until 2026-04-16 22:14:13.85978727 +0000 UTC m=+34.084953630 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert") pod "ingress-canary-tbnzs" (UID: "fd83defb-549f-4b35-96b4-9fe5c34c7ebf") : secret "canary-serving-cert" not found
Apr 16 22:14:12.860124 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:12.859740 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:12.860124 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:12.859871 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls podName:2357d8ff-11e8-4af7-a6de-7223d6294ade nodeName:}" failed. No retries permitted until 2026-04-16 22:14:13.859858569 +0000 UTC m=+34.085024932 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls") pod "dns-default-mf4m2" (UID: "2357d8ff-11e8-4af7-a6de-7223d6294ade") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:13.061325 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:13.061277 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs\") pod \"network-metrics-daemon-bvqqh\" (UID: \"bb0e10b7-e200-4438-8847-608a1a6cace3\") " pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:14:13.061485 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:13.061433 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:14:13.061537 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:13.061514 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs podName:bb0e10b7-e200-4438-8847-608a1a6cace3 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:45.061493346 +0000 UTC m=+65.286659709 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs") pod "network-metrics-daemon-bvqqh" (UID: "bb0e10b7-e200-4438-8847-608a1a6cace3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:14:13.162754 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:13.162658 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82g6r\" (UniqueName: \"kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r\") pod \"network-check-target-zh7bj\" (UID: \"6ea887bd-cdbc-451b-9d3d-df42e2023e31\") " pod="openshift-network-diagnostics/network-check-target-zh7bj"
Apr 16 22:14:13.162932 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:13.162829 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:14:13.162932 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:13.162857 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:14:13.162932 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:13.162872 2572 projected.go:194] Error preparing data for projected volume kube-api-access-82g6r for pod openshift-network-diagnostics/network-check-target-zh7bj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:14:13.163082 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:13.162939 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r podName:6ea887bd-cdbc-451b-9d3d-df42e2023e31 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:45.162920953 +0000 UTC m=+65.388087333 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-82g6r" (UniqueName: "kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r") pod "network-check-target-zh7bj" (UID: "6ea887bd-cdbc-451b-9d3d-df42e2023e31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:14:13.440206 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:13.440126 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj"
Apr 16 22:14:13.440382 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:13.440363 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:14:13.443346 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:13.443183 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 22:14:13.443346 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:13.443209 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 22:14:13.444409 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:13.444387 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-28226\""
Apr 16 22:14:13.444510 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:13.444457 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 22:14:13.444510 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:13.444465 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hqwt5\""
Apr 16 22:14:13.870025 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:13.869987 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert\") pod \"ingress-canary-tbnzs\" (UID: \"fd83defb-549f-4b35-96b4-9fe5c34c7ebf\") " pod="openshift-ingress-canary/ingress-canary-tbnzs"
Apr 16 22:14:13.870426 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:13.870053 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:14:13.870426 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:13.870164 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:13.870426 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:13.870165 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:13.870426 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:13.870232 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls podName:2357d8ff-11e8-4af7-a6de-7223d6294ade nodeName:}" failed. No retries permitted until 2026-04-16 22:14:15.870213393 +0000 UTC m=+36.095379753 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls") pod "dns-default-mf4m2" (UID: "2357d8ff-11e8-4af7-a6de-7223d6294ade") : secret "dns-default-metrics-tls" not found Apr 16 22:14:13.870426 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:13.870247 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert podName:fd83defb-549f-4b35-96b4-9fe5c34c7ebf nodeName:}" failed. No retries permitted until 2026-04-16 22:14:15.870241529 +0000 UTC m=+36.095407886 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert") pod "ingress-canary-tbnzs" (UID: "fd83defb-549f-4b35-96b4-9fe5c34c7ebf") : secret "canary-serving-cert" not found Apr 16 22:14:15.690546 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:15.690516 2572 generic.go:358] "Generic (PLEG): container finished" podID="5ca5fa21-17cb-4a9f-867d-d2d0533d72c3" containerID="62c529cf41a9bd9b7ebf395a273a601fa3e7045b72e97745a133cd16438ce666" exitCode=0 Apr 16 22:14:15.690940 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:15.690579 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9jnj" event={"ID":"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3","Type":"ContainerDied","Data":"62c529cf41a9bd9b7ebf395a273a601fa3e7045b72e97745a133cd16438ce666"} Apr 16 22:14:15.885953 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:15.885918 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert\") pod \"ingress-canary-tbnzs\" (UID: \"fd83defb-549f-4b35-96b4-9fe5c34c7ebf\") " pod="openshift-ingress-canary/ingress-canary-tbnzs" Apr 16 22:14:15.886128 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:15.886000 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2" Apr 16 22:14:15.886128 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:15.886063 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:15.886128 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:15.886090 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:15.886253 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:15.886145 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert podName:fd83defb-549f-4b35-96b4-9fe5c34c7ebf nodeName:}" failed. No retries permitted until 2026-04-16 22:14:19.886126206 +0000 UTC m=+40.111292590 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert") pod "ingress-canary-tbnzs" (UID: "fd83defb-549f-4b35-96b4-9fe5c34c7ebf") : secret "canary-serving-cert" not found Apr 16 22:14:15.886253 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:15.886161 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls podName:2357d8ff-11e8-4af7-a6de-7223d6294ade nodeName:}" failed. No retries permitted until 2026-04-16 22:14:19.886153953 +0000 UTC m=+40.111320310 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls") pod "dns-default-mf4m2" (UID: "2357d8ff-11e8-4af7-a6de-7223d6294ade") : secret "dns-default-metrics-tls" not found Apr 16 22:14:16.695124 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:16.695075 2572 generic.go:358] "Generic (PLEG): container finished" podID="5ca5fa21-17cb-4a9f-867d-d2d0533d72c3" containerID="308adfbff0291f4569d18006ba2e3012ba81608f23b42a5cdf53556fae0ea95d" exitCode=0 Apr 16 22:14:16.695530 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:16.695159 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9jnj" event={"ID":"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3","Type":"ContainerDied","Data":"308adfbff0291f4569d18006ba2e3012ba81608f23b42a5cdf53556fae0ea95d"} Apr 16 22:14:17.700141 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:17.700089 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9jnj" event={"ID":"5ca5fa21-17cb-4a9f-867d-d2d0533d72c3","Type":"ContainerStarted","Data":"c55291374caee9be9dbe2789acd5a750252a5d27cabbd7608ad090cbe1940cc7"} Apr 16 22:14:17.740687 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:17.740637 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-m9jnj" podStartSLOduration=4.853480604 podStartE2EDuration="37.740620991s" podCreationTimestamp="2026-04-16 22:13:40 +0000 UTC" firstStartedPulling="2026-04-16 22:13:41.674987844 +0000 UTC m=+1.900154203" lastFinishedPulling="2026-04-16 22:14:14.562128229 +0000 UTC m=+34.787294590" observedRunningTime="2026-04-16 22:14:17.738403325 +0000 UTC m=+37.963569706" watchObservedRunningTime="2026-04-16 22:14:17.740620991 +0000 UTC m=+37.965787371" Apr 16 22:14:19.915093 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:19.914893 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert\") pod \"ingress-canary-tbnzs\" (UID: \"fd83defb-549f-4b35-96b4-9fe5c34c7ebf\") " pod="openshift-ingress-canary/ingress-canary-tbnzs" Apr 16 22:14:19.915549 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:19.915047 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:19.915549 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:19.915181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2" Apr 16 22:14:19.915549 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:19.915202 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert podName:fd83defb-549f-4b35-96b4-9fe5c34c7ebf nodeName:}" failed. No retries permitted until 2026-04-16 22:14:27.91518619 +0000 UTC m=+48.140352548 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert") pod "ingress-canary-tbnzs" (UID: "fd83defb-549f-4b35-96b4-9fe5c34c7ebf") : secret "canary-serving-cert" not found Apr 16 22:14:19.915549 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:19.915279 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:19.915549 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:19.915334 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls podName:2357d8ff-11e8-4af7-a6de-7223d6294ade nodeName:}" failed. 
No retries permitted until 2026-04-16 22:14:27.915319121 +0000 UTC m=+48.140485484 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls") pod "dns-default-mf4m2" (UID: "2357d8ff-11e8-4af7-a6de-7223d6294ade") : secret "dns-default-metrics-tls" not found Apr 16 22:14:27.973058 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:27.973015 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2" Apr 16 22:14:27.973058 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:27.973075 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert\") pod \"ingress-canary-tbnzs\" (UID: \"fd83defb-549f-4b35-96b4-9fe5c34c7ebf\") " pod="openshift-ingress-canary/ingress-canary-tbnzs" Apr 16 22:14:27.973670 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:27.973178 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:27.973670 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:27.973184 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:27.973670 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:27.973242 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert podName:fd83defb-549f-4b35-96b4-9fe5c34c7ebf nodeName:}" failed. No retries permitted until 2026-04-16 22:14:43.973227933 +0000 UTC m=+64.198394292 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert") pod "ingress-canary-tbnzs" (UID: "fd83defb-549f-4b35-96b4-9fe5c34c7ebf") : secret "canary-serving-cert" not found Apr 16 22:14:27.973670 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:27.973254 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls podName:2357d8ff-11e8-4af7-a6de-7223d6294ade nodeName:}" failed. No retries permitted until 2026-04-16 22:14:43.97324874 +0000 UTC m=+64.198415099 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls") pod "dns-default-mf4m2" (UID: "2357d8ff-11e8-4af7-a6de-7223d6294ade") : secret "dns-default-metrics-tls" not found Apr 16 22:14:37.681206 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:37.681177 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6bqmn" Apr 16 22:14:43.987121 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:43.987060 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2" Apr 16 22:14:43.987599 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:43.987165 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert\") pod \"ingress-canary-tbnzs\" (UID: \"fd83defb-549f-4b35-96b4-9fe5c34c7ebf\") " pod="openshift-ingress-canary/ingress-canary-tbnzs" Apr 16 22:14:43.987599 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:43.987220 2572 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:43.987599 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:43.987253 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:43.987599 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:43.987299 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls podName:2357d8ff-11e8-4af7-a6de-7223d6294ade nodeName:}" failed. No retries permitted until 2026-04-16 22:15:15.987280762 +0000 UTC m=+96.212447125 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls") pod "dns-default-mf4m2" (UID: "2357d8ff-11e8-4af7-a6de-7223d6294ade") : secret "dns-default-metrics-tls" not found Apr 16 22:14:43.987599 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:43.987315 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert podName:fd83defb-549f-4b35-96b4-9fe5c34c7ebf nodeName:}" failed. No retries permitted until 2026-04-16 22:15:15.987308691 +0000 UTC m=+96.212475049 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert") pod "ingress-canary-tbnzs" (UID: "fd83defb-549f-4b35-96b4-9fe5c34c7ebf") : secret "canary-serving-cert" not found Apr 16 22:14:45.094927 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:45.094887 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs\") pod \"network-metrics-daemon-bvqqh\" (UID: \"bb0e10b7-e200-4438-8847-608a1a6cace3\") " pod="openshift-multus/network-metrics-daemon-bvqqh" Apr 16 22:14:45.097908 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:45.097888 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:14:45.105343 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:45.105318 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:14:45.105451 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:14:45.105394 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs podName:bb0e10b7-e200-4438-8847-608a1a6cace3 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:49.105372791 +0000 UTC m=+129.330539149 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs") pod "network-metrics-daemon-bvqqh" (UID: "bb0e10b7-e200-4438-8847-608a1a6cace3") : secret "metrics-daemon-secret" not found Apr 16 22:14:45.195511 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:45.195461 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82g6r\" (UniqueName: \"kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r\") pod \"network-check-target-zh7bj\" (UID: \"6ea887bd-cdbc-451b-9d3d-df42e2023e31\") " pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:14:45.198159 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:45.198133 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:14:45.208996 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:45.208968 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:14:45.220700 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:45.220667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82g6r\" (UniqueName: \"kubernetes.io/projected/6ea887bd-cdbc-451b-9d3d-df42e2023e31-kube-api-access-82g6r\") pod \"network-check-target-zh7bj\" (UID: \"6ea887bd-cdbc-451b-9d3d-df42e2023e31\") " pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:14:45.254423 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:45.254387 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-28226\"" Apr 16 22:14:45.261743 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:45.261720 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:14:45.434668 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:45.434637 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zh7bj"] Apr 16 22:14:45.437734 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:14:45.437704 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ea887bd_cdbc_451b_9d3d_df42e2023e31.slice/crio-f895e2c052f6cfc5b46834c27b25a2fa76b920ce25d6a0544b223ca0c9af1c0d WatchSource:0}: Error finding container f895e2c052f6cfc5b46834c27b25a2fa76b920ce25d6a0544b223ca0c9af1c0d: Status 404 returned error can't find the container with id f895e2c052f6cfc5b46834c27b25a2fa76b920ce25d6a0544b223ca0c9af1c0d Apr 16 22:14:45.755545 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:45.755450 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zh7bj" event={"ID":"6ea887bd-cdbc-451b-9d3d-df42e2023e31","Type":"ContainerStarted","Data":"f895e2c052f6cfc5b46834c27b25a2fa76b920ce25d6a0544b223ca0c9af1c0d"} Apr 16 22:14:48.763124 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:48.763071 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zh7bj" event={"ID":"6ea887bd-cdbc-451b-9d3d-df42e2023e31","Type":"ContainerStarted","Data":"357e5ae4f8c42ceec3bd3b8db62ae304f696590d7097d2fe5398128156ccad45"} Apr 16 22:14:48.763565 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:48.763235 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:14:48.778333 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:14:48.778272 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zh7bj" 
podStartSLOduration=65.995153169 podStartE2EDuration="1m8.778250929s" podCreationTimestamp="2026-04-16 22:13:40 +0000 UTC" firstStartedPulling="2026-04-16 22:14:45.439806846 +0000 UTC m=+65.664973204" lastFinishedPulling="2026-04-16 22:14:48.222904602 +0000 UTC m=+68.448070964" observedRunningTime="2026-04-16 22:14:48.777630561 +0000 UTC m=+69.002796940" watchObservedRunningTime="2026-04-16 22:14:48.778250929 +0000 UTC m=+69.003417309" Apr 16 22:15:16.014906 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:16.014858 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2" Apr 16 22:15:16.014906 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:16.014910 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert\") pod \"ingress-canary-tbnzs\" (UID: \"fd83defb-549f-4b35-96b4-9fe5c34c7ebf\") " pod="openshift-ingress-canary/ingress-canary-tbnzs" Apr 16 22:15:16.015449 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:16.014999 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:15:16.015449 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:16.015015 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:15:16.015449 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:16.015065 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls podName:2357d8ff-11e8-4af7-a6de-7223d6294ade nodeName:}" failed. 
No retries permitted until 2026-04-16 22:16:20.015047497 +0000 UTC m=+160.240213855 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls") pod "dns-default-mf4m2" (UID: "2357d8ff-11e8-4af7-a6de-7223d6294ade") : secret "dns-default-metrics-tls" not found Apr 16 22:15:16.015449 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:16.015079 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert podName:fd83defb-549f-4b35-96b4-9fe5c34c7ebf nodeName:}" failed. No retries permitted until 2026-04-16 22:16:20.015072627 +0000 UTC m=+160.240238985 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert") pod "ingress-canary-tbnzs" (UID: "fd83defb-549f-4b35-96b4-9fe5c34c7ebf") : secret "canary-serving-cert" not found Apr 16 22:15:19.767515 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:19.767479 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zh7bj" Apr 16 22:15:20.618864 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.618825 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g"] Apr 16 22:15:20.621225 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.621206 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g" Apr 16 22:15:20.623675 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.623647 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 22:15:20.625082 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.625060 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 22:15:20.626589 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.626564 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 22:15:20.626704 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.626633 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 22:15:20.626704 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.626633 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-lmn2g\"" Apr 16 22:15:20.628478 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.628424 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g"] Apr 16 22:15:20.723905 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.723869 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5c7c6f9bb8-k7dmc"] Apr 16 22:15:20.725742 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.725725 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" Apr 16 22:15:20.731351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.731318 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 22:15:20.731649 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.731367 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 22:15:20.731649 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.731383 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 22:15:20.731649 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.731576 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 22:15:20.731936 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.731792 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-xmhbl\"" Apr 16 22:15:20.731936 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.731879 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 22:15:20.732038 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.731987 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 22:15:20.736732 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.736710 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5c7c6f9bb8-k7dmc"] Apr 16 22:15:20.744224 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.744200 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bjs2g\" (UID: \"ee7387b2-e00d-45fd-881b-86e9cd70ab5b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g"
Apr 16 22:15:20.744338 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.744253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcgll\" (UniqueName: \"kubernetes.io/projected/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-kube-api-access-hcgll\") pod \"cluster-monitoring-operator-75587bd455-bjs2g\" (UID: \"ee7387b2-e00d-45fd-881b-86e9cd70ab5b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g"
Apr 16 22:15:20.744338 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.744286 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-bjs2g\" (UID: \"ee7387b2-e00d-45fd-881b-86e9cd70ab5b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g"
Apr 16 22:15:20.821255 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.821223 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ghzq7"]
Apr 16 22:15:20.822947 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.822929 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ghzq7"
Apr 16 22:15:20.825992 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.825953 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-djxkf\""
Apr 16 22:15:20.825992 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.825972 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:15:20.826204 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.826088 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 22:15:20.827598 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.827574 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8m6vz"]
Apr 16 22:15:20.830611 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.830588 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-2qjf4"]
Apr 16 22:15:20.830757 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.830740 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8m6vz"
Apr 16 22:15:20.832280 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.832260 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg"]
Apr 16 22:15:20.832411 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.832396 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:20.833326 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.833308 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-dkk6q\""
Apr 16 22:15:20.834181 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.834165 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg"
Apr 16 22:15:20.834668 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.834647 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 22:15:20.834773 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.834677 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 16 22:15:20.834773 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.834702 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-lrpt5\""
Apr 16 22:15:20.834773 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.834758 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 16 22:15:20.834940 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.834909 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 22:15:20.836817 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.836794 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:15:20.836934 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.836855 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jp8mt\""
Apr 16 22:15:20.837055 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.837041 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 22:15:20.838249 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.838232 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 22:15:20.840495 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.840472 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 16 22:15:20.841061 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.840871 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ghzq7"]
Apr 16 22:15:20.842523 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.842503 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8m6vz"]
Apr 16 22:15:20.844952 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.844924 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-stats-auth\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:20.845073 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.844966 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjftj\" (UniqueName: \"kubernetes.io/projected/5e1bbae3-73bb-42df-9209-225120eedbb0-kube-api-access-zjftj\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:20.845073 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.845014 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bjs2g\" (UID: \"ee7387b2-e00d-45fd-881b-86e9cd70ab5b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g"
Apr 16 22:15:20.845073 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.845066 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-default-certificate\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:20.845269 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.845101 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:20.845269 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:20.845137 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 22:15:20.845269 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.845159 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcgll\" (UniqueName: \"kubernetes.io/projected/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-kube-api-access-hcgll\") pod \"cluster-monitoring-operator-75587bd455-bjs2g\" (UID: \"ee7387b2-e00d-45fd-881b-86e9cd70ab5b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g"
Apr 16 22:15:20.845269 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.845184 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:20.845269 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:20.845211 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls podName:ee7387b2-e00d-45fd-881b-86e9cd70ab5b nodeName:}" failed. No retries permitted until 2026-04-16 22:15:21.345190193 +0000 UTC m=+101.570356570 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bjs2g" (UID: "ee7387b2-e00d-45fd-881b-86e9cd70ab5b") : secret "cluster-monitoring-operator-tls" not found
Apr 16 22:15:20.845514 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.845409 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-bjs2g\" (UID: \"ee7387b2-e00d-45fd-881b-86e9cd70ab5b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g"
Apr 16 22:15:20.846410 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.846389 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-bjs2g\" (UID: \"ee7387b2-e00d-45fd-881b-86e9cd70ab5b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g"
Apr 16 22:15:20.846766 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.846740 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-2qjf4"]
Apr 16 22:15:20.855335 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.855305 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg"]
Apr 16 22:15:20.864193 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.864162 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcgll\" (UniqueName: \"kubernetes.io/projected/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-kube-api-access-hcgll\") pod \"cluster-monitoring-operator-75587bd455-bjs2g\" (UID: \"ee7387b2-e00d-45fd-881b-86e9cd70ab5b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g"
Apr 16 22:15:20.921294 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.921204 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv"]
Apr 16 22:15:20.923503 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.923485 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv"
Apr 16 22:15:20.925962 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.925939 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 22:15:20.926154 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.926044 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 22:15:20.926233 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.926165 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 22:15:20.926309 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.926295 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:15:20.926434 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.926416 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-twkd5\""
Apr 16 22:15:20.928704 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.928681 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq"]
Apr 16 22:15:20.931604 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.930961 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq"
Apr 16 22:15:20.933295 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.933265 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv"]
Apr 16 22:15:20.933920 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.933900 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 22:15:20.934208 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.934188 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 16 22:15:20.934312 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.934285 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:15:20.934382 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.934364 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-76dwz\""
Apr 16 22:15:20.934435 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.934389 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 22:15:20.956494 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.956451 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvrfd\" (UniqueName: \"kubernetes.io/projected/8642ce80-b932-4245-8f5a-3ce2e6014659-kube-api-access-wvrfd\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:20.956671 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.956505 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d68fg\" (UID: \"1c648959-2d38-4121-8bc5-57108f33c786\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg"
Apr 16 22:15:20.956671 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.956566 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-stats-auth\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:20.956671 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.956594 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8642ce80-b932-4245-8f5a-3ce2e6014659-tmp\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:20.956671 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.956630 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjftj\" (UniqueName: \"kubernetes.io/projected/5e1bbae3-73bb-42df-9209-225120eedbb0-kube-api-access-zjftj\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:20.956868 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.956693 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klsl5\" (UniqueName: \"kubernetes.io/projected/2bfe578c-4062-4475-9e81-7dcc34aef95a-kube-api-access-klsl5\") pod \"network-check-source-8894fc9bd-8m6vz\" (UID: \"2bfe578c-4062-4475-9e81-7dcc34aef95a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8m6vz"
Apr 16 22:15:20.956868 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.956718 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt7qh\" (UniqueName: \"kubernetes.io/projected/8c11f0d2-8369-4d94-8465-bfa195f8781d-kube-api-access-zt7qh\") pod \"volume-data-source-validator-7c6cbb6c87-ghzq7\" (UID: \"8c11f0d2-8369-4d94-8465-bfa195f8781d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ghzq7"
Apr 16 22:15:20.956868 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.956756 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-default-certificate\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:20.956868 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.956783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:20.956868 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.956805 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8642ce80-b932-4245-8f5a-3ce2e6014659-snapshots\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:20.956868 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.956843 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8642ce80-b932-4245-8f5a-3ce2e6014659-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:20.957192 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.956871 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:20.957192 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.956906 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh2xh\" (UniqueName: \"kubernetes.io/projected/1c648959-2d38-4121-8bc5-57108f33c786-kube-api-access-qh2xh\") pod \"cluster-samples-operator-6dc5bdb6b4-d68fg\" (UID: \"1c648959-2d38-4121-8bc5-57108f33c786\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg"
Apr 16 22:15:20.957192 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.956943 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8642ce80-b932-4245-8f5a-3ce2e6014659-serving-cert\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:20.957192 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.956970 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8642ce80-b932-4245-8f5a-3ce2e6014659-service-ca-bundle\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:20.960319 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:20.957723 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 22:15:20.960319 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:20.957873 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs podName:5e1bbae3-73bb-42df-9209-225120eedbb0 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:21.457851357 +0000 UTC m=+101.683017716 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs") pod "router-default-5c7c6f9bb8-k7dmc" (UID: "5e1bbae3-73bb-42df-9209-225120eedbb0") : secret "router-metrics-certs-default" not found
Apr 16 22:15:20.960319 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:20.958244 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle podName:5e1bbae3-73bb-42df-9209-225120eedbb0 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:21.458225889 +0000 UTC m=+101.683392264 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle") pod "router-default-5c7c6f9bb8-k7dmc" (UID: "5e1bbae3-73bb-42df-9209-225120eedbb0") : configmap references non-existent config key: service-ca.crt
Apr 16 22:15:20.960319 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.959813 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq"]
Apr 16 22:15:20.965126 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.960914 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-default-certificate\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:20.968328 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.968304 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-stats-auth\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:20.974632 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:20.974601 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjftj\" (UniqueName: \"kubernetes.io/projected/5e1bbae3-73bb-42df-9209-225120eedbb0-kube-api-access-zjftj\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:21.057510 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.057464 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8642ce80-b932-4245-8f5a-3ce2e6014659-service-ca-bundle\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:21.057510 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.057514 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l8v7\" (UniqueName: \"kubernetes.io/projected/64846700-7b7f-4273-b209-6b6a814c538e-kube-api-access-6l8v7\") pod \"service-ca-operator-d6fc45fc5-nkvrv\" (UID: \"64846700-7b7f-4273-b209-6b6a814c538e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv"
Apr 16 22:15:21.057722 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.057559 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/477be877-0af9-4fc8-85b1-24658b74a7a8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-sblhq\" (UID: \"477be877-0af9-4fc8-85b1-24658b74a7a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq"
Apr 16 22:15:21.057722 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.057597 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zt7qh\" (UniqueName: \"kubernetes.io/projected/8c11f0d2-8369-4d94-8465-bfa195f8781d-kube-api-access-zt7qh\") pod \"volume-data-source-validator-7c6cbb6c87-ghzq7\" (UID: \"8c11f0d2-8369-4d94-8465-bfa195f8781d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ghzq7"
Apr 16 22:15:21.057722 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.057664 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klsl5\" (UniqueName: \"kubernetes.io/projected/2bfe578c-4062-4475-9e81-7dcc34aef95a-kube-api-access-klsl5\") pod \"network-check-source-8894fc9bd-8m6vz\" (UID: \"2bfe578c-4062-4475-9e81-7dcc34aef95a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8m6vz"
Apr 16 22:15:21.057722 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.057701 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64846700-7b7f-4273-b209-6b6a814c538e-config\") pod \"service-ca-operator-d6fc45fc5-nkvrv\" (UID: \"64846700-7b7f-4273-b209-6b6a814c538e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv"
Apr 16 22:15:21.057859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.057740 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64846700-7b7f-4273-b209-6b6a814c538e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-nkvrv\" (UID: \"64846700-7b7f-4273-b209-6b6a814c538e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv"
Apr 16 22:15:21.057859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.057761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/477be877-0af9-4fc8-85b1-24658b74a7a8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-sblhq\" (UID: \"477be877-0af9-4fc8-85b1-24658b74a7a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq"
Apr 16 22:15:21.057859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.057791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8642ce80-b932-4245-8f5a-3ce2e6014659-serving-cert\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:21.057859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.057838 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvrfd\" (UniqueName: \"kubernetes.io/projected/8642ce80-b932-4245-8f5a-3ce2e6014659-kube-api-access-wvrfd\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:21.057994 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.057865 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d68fg\" (UID: \"1c648959-2d38-4121-8bc5-57108f33c786\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg"
Apr 16 22:15:21.057994 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.057895 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8642ce80-b932-4245-8f5a-3ce2e6014659-tmp\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:21.057994 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:21.057958 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 22:15:21.058090 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:21.058030 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls podName:1c648959-2d38-4121-8bc5-57108f33c786 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:21.5580125 +0000 UTC m=+101.783178862 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d68fg" (UID: "1c648959-2d38-4121-8bc5-57108f33c786") : secret "samples-operator-tls" not found
Apr 16 22:15:21.058090 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.057964 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8642ce80-b932-4245-8f5a-3ce2e6014659-snapshots\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:21.058218 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.058146 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8642ce80-b932-4245-8f5a-3ce2e6014659-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:21.058218 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.058172 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8642ce80-b932-4245-8f5a-3ce2e6014659-service-ca-bundle\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:21.058218 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.058191 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn5j8\" (UniqueName: \"kubernetes.io/projected/477be877-0af9-4fc8-85b1-24658b74a7a8-kube-api-access-sn5j8\") pod \"kube-storage-version-migrator-operator-6769c5d45-sblhq\" (UID: \"477be877-0af9-4fc8-85b1-24658b74a7a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq"
Apr 16 22:15:21.058370 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.058262 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qh2xh\" (UniqueName: \"kubernetes.io/projected/1c648959-2d38-4121-8bc5-57108f33c786-kube-api-access-qh2xh\") pod \"cluster-samples-operator-6dc5bdb6b4-d68fg\" (UID: \"1c648959-2d38-4121-8bc5-57108f33c786\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg"
Apr 16 22:15:21.058465 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.058450 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8642ce80-b932-4245-8f5a-3ce2e6014659-tmp\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:21.058581 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.058564 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8642ce80-b932-4245-8f5a-3ce2e6014659-snapshots\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:21.058929 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.058909 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8642ce80-b932-4245-8f5a-3ce2e6014659-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:21.060187 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.060167 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8642ce80-b932-4245-8f5a-3ce2e6014659-serving-cert\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:21.066870 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.066846 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh2xh\" (UniqueName: \"kubernetes.io/projected/1c648959-2d38-4121-8bc5-57108f33c786-kube-api-access-qh2xh\") pod \"cluster-samples-operator-6dc5bdb6b4-d68fg\" (UID: \"1c648959-2d38-4121-8bc5-57108f33c786\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg"
Apr 16 22:15:21.067007 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.066890 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt7qh\" (UniqueName: \"kubernetes.io/projected/8c11f0d2-8369-4d94-8465-bfa195f8781d-kube-api-access-zt7qh\") pod \"volume-data-source-validator-7c6cbb6c87-ghzq7\" (UID: \"8c11f0d2-8369-4d94-8465-bfa195f8781d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ghzq7"
Apr 16 22:15:21.067302 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.067286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvrfd\" (UniqueName: \"kubernetes.io/projected/8642ce80-b932-4245-8f5a-3ce2e6014659-kube-api-access-wvrfd\") pod \"insights-operator-585dfdc468-2qjf4\" (UID: \"8642ce80-b932-4245-8f5a-3ce2e6014659\") " pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:21.067567 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.067550 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klsl5\" (UniqueName: \"kubernetes.io/projected/2bfe578c-4062-4475-9e81-7dcc34aef95a-kube-api-access-klsl5\") pod \"network-check-source-8894fc9bd-8m6vz\" (UID: \"2bfe578c-4062-4475-9e81-7dcc34aef95a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8m6vz"
Apr 16 22:15:21.133075 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.133037 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ghzq7"
Apr 16 22:15:21.142838 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.142811 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8m6vz"
Apr 16 22:15:21.149632 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.149607 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-2qjf4"
Apr 16 22:15:21.159574 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.159538 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l8v7\" (UniqueName: \"kubernetes.io/projected/64846700-7b7f-4273-b209-6b6a814c538e-kube-api-access-6l8v7\") pod \"service-ca-operator-d6fc45fc5-nkvrv\" (UID: \"64846700-7b7f-4273-b209-6b6a814c538e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv"
Apr 16 22:15:21.159719 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.159591 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/477be877-0af9-4fc8-85b1-24658b74a7a8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-sblhq\" (UID: \"477be877-0af9-4fc8-85b1-24658b74a7a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq"
Apr 16 22:15:21.159719 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.159633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64846700-7b7f-4273-b209-6b6a814c538e-config\") pod
\"service-ca-operator-d6fc45fc5-nkvrv\" (UID: \"64846700-7b7f-4273-b209-6b6a814c538e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv" Apr 16 22:15:21.159719 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.159675 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64846700-7b7f-4273-b209-6b6a814c538e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-nkvrv\" (UID: \"64846700-7b7f-4273-b209-6b6a814c538e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv" Apr 16 22:15:21.160001 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.159976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/477be877-0af9-4fc8-85b1-24658b74a7a8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-sblhq\" (UID: \"477be877-0af9-4fc8-85b1-24658b74a7a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq" Apr 16 22:15:21.160154 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.160130 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn5j8\" (UniqueName: \"kubernetes.io/projected/477be877-0af9-4fc8-85b1-24658b74a7a8-kube-api-access-sn5j8\") pod \"kube-storage-version-migrator-operator-6769c5d45-sblhq\" (UID: \"477be877-0af9-4fc8-85b1-24658b74a7a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq" Apr 16 22:15:21.160437 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.160354 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/477be877-0af9-4fc8-85b1-24658b74a7a8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-sblhq\" (UID: \"477be877-0af9-4fc8-85b1-24658b74a7a8\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq" Apr 16 22:15:21.160849 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.160822 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64846700-7b7f-4273-b209-6b6a814c538e-config\") pod \"service-ca-operator-d6fc45fc5-nkvrv\" (UID: \"64846700-7b7f-4273-b209-6b6a814c538e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv" Apr 16 22:15:21.162532 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.162494 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64846700-7b7f-4273-b209-6b6a814c538e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-nkvrv\" (UID: \"64846700-7b7f-4273-b209-6b6a814c538e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv" Apr 16 22:15:21.163157 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.163135 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/477be877-0af9-4fc8-85b1-24658b74a7a8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-sblhq\" (UID: \"477be877-0af9-4fc8-85b1-24658b74a7a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq" Apr 16 22:15:21.168574 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.168542 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn5j8\" (UniqueName: \"kubernetes.io/projected/477be877-0af9-4fc8-85b1-24658b74a7a8-kube-api-access-sn5j8\") pod \"kube-storage-version-migrator-operator-6769c5d45-sblhq\" (UID: \"477be877-0af9-4fc8-85b1-24658b74a7a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq" Apr 16 22:15:21.168826 ip-10-0-133-16 kubenswrapper[2572]: 
I0416 22:15:21.168808 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l8v7\" (UniqueName: \"kubernetes.io/projected/64846700-7b7f-4273-b209-6b6a814c538e-kube-api-access-6l8v7\") pod \"service-ca-operator-d6fc45fc5-nkvrv\" (UID: \"64846700-7b7f-4273-b209-6b6a814c538e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv" Apr 16 22:15:21.234659 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.234624 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv" Apr 16 22:15:21.257078 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.257034 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq" Apr 16 22:15:21.284267 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.284204 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ghzq7"] Apr 16 22:15:21.287992 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:15:21.287929 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c11f0d2_8369_4d94_8465_bfa195f8781d.slice/crio-38abc64ee74ffd9450e32cde70b3d4c436676e54ed0e97250eb74014158c6c73 WatchSource:0}: Error finding container 38abc64ee74ffd9450e32cde70b3d4c436676e54ed0e97250eb74014158c6c73: Status 404 returned error can't find the container with id 38abc64ee74ffd9450e32cde70b3d4c436676e54ed0e97250eb74014158c6c73 Apr 16 22:15:21.362631 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.362593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-bjs2g\" (UID: \"ee7387b2-e00d-45fd-881b-86e9cd70ab5b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g" Apr 16 22:15:21.362776 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:21.362758 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:21.362839 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:21.362829 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls podName:ee7387b2-e00d-45fd-881b-86e9cd70ab5b nodeName:}" failed. No retries permitted until 2026-04-16 22:15:22.362812738 +0000 UTC m=+102.587979097 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bjs2g" (UID: "ee7387b2-e00d-45fd-881b-86e9cd70ab5b") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:21.366538 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.366509 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv"] Apr 16 22:15:21.371449 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:15:21.371415 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64846700_7b7f_4273_b209_6b6a814c538e.slice/crio-8f54c4c82da2f33a1d87efe582989f38cb22c17f882e46017aac844601bb83c7 WatchSource:0}: Error finding container 8f54c4c82da2f33a1d87efe582989f38cb22c17f882e46017aac844601bb83c7: Status 404 returned error can't find the container with id 8f54c4c82da2f33a1d87efe582989f38cb22c17f882e46017aac844601bb83c7 Apr 16 22:15:21.385941 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.385911 2572 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq"] Apr 16 22:15:21.389072 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:15:21.389043 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod477be877_0af9_4fc8_85b1_24658b74a7a8.slice/crio-9d0e0b8ec2232c35c2e285b9d3d9632b4433686e0350a45bf9c24a99d9a812dd WatchSource:0}: Error finding container 9d0e0b8ec2232c35c2e285b9d3d9632b4433686e0350a45bf9c24a99d9a812dd: Status 404 returned error can't find the container with id 9d0e0b8ec2232c35c2e285b9d3d9632b4433686e0350a45bf9c24a99d9a812dd Apr 16 22:15:21.463825 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.463729 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" Apr 16 22:15:21.463825 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.463779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" Apr 16 22:15:21.464004 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:21.463882 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:15:21.464004 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:21.463900 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle podName:5e1bbae3-73bb-42df-9209-225120eedbb0 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:22.463879769 +0000 UTC m=+102.689046127 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle") pod "router-default-5c7c6f9bb8-k7dmc" (UID: "5e1bbae3-73bb-42df-9209-225120eedbb0") : configmap references non-existent config key: service-ca.crt Apr 16 22:15:21.464004 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:21.463940 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs podName:5e1bbae3-73bb-42df-9209-225120eedbb0 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:22.463927367 +0000 UTC m=+102.689093730 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs") pod "router-default-5c7c6f9bb8-k7dmc" (UID: "5e1bbae3-73bb-42df-9209-225120eedbb0") : secret "router-metrics-certs-default" not found Apr 16 22:15:21.497743 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.497706 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-2qjf4"] Apr 16 22:15:21.500797 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.500776 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8m6vz"] Apr 16 22:15:21.500925 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:15:21.500904 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8642ce80_b932_4245_8f5a_3ce2e6014659.slice/crio-74927b6553428bdde0856583cc0cd4d5d63b26aa89f7504c0325eefb494ffe02 WatchSource:0}: 
Error finding container 74927b6553428bdde0856583cc0cd4d5d63b26aa89f7504c0325eefb494ffe02: Status 404 returned error can't find the container with id 74927b6553428bdde0856583cc0cd4d5d63b26aa89f7504c0325eefb494ffe02 Apr 16 22:15:21.502601 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:15:21.502583 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bfe578c_4062_4475_9e81_7dcc34aef95a.slice/crio-0d41872b6eaa6401baeedf34b83b1f5987a622e11e53f8b418049b780b637cde WatchSource:0}: Error finding container 0d41872b6eaa6401baeedf34b83b1f5987a622e11e53f8b418049b780b637cde: Status 404 returned error can't find the container with id 0d41872b6eaa6401baeedf34b83b1f5987a622e11e53f8b418049b780b637cde Apr 16 22:15:21.564806 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.564764 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d68fg\" (UID: \"1c648959-2d38-4121-8bc5-57108f33c786\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg" Apr 16 22:15:21.565007 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:21.564905 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:15:21.565007 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:21.564973 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls podName:1c648959-2d38-4121-8bc5-57108f33c786 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:22.564956222 +0000 UTC m=+102.790122581 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d68fg" (UID: "1c648959-2d38-4121-8bc5-57108f33c786") : secret "samples-operator-tls" not found Apr 16 22:15:21.829196 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.829147 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8m6vz" event={"ID":"2bfe578c-4062-4475-9e81-7dcc34aef95a","Type":"ContainerStarted","Data":"13cdb9915852054093f9a9260620c09f9a5382bb85940de523497a6838c9608b"} Apr 16 22:15:21.829196 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.829202 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8m6vz" event={"ID":"2bfe578c-4062-4475-9e81-7dcc34aef95a","Type":"ContainerStarted","Data":"0d41872b6eaa6401baeedf34b83b1f5987a622e11e53f8b418049b780b637cde"} Apr 16 22:15:21.830641 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.830602 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv" event={"ID":"64846700-7b7f-4273-b209-6b6a814c538e","Type":"ContainerStarted","Data":"8f54c4c82da2f33a1d87efe582989f38cb22c17f882e46017aac844601bb83c7"} Apr 16 22:15:21.831706 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.831671 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2qjf4" event={"ID":"8642ce80-b932-4245-8f5a-3ce2e6014659","Type":"ContainerStarted","Data":"74927b6553428bdde0856583cc0cd4d5d63b26aa89f7504c0325eefb494ffe02"} Apr 16 22:15:21.832853 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.832827 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ghzq7" 
event={"ID":"8c11f0d2-8369-4d94-8465-bfa195f8781d","Type":"ContainerStarted","Data":"38abc64ee74ffd9450e32cde70b3d4c436676e54ed0e97250eb74014158c6c73"} Apr 16 22:15:21.833822 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.833795 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq" event={"ID":"477be877-0af9-4fc8-85b1-24658b74a7a8","Type":"ContainerStarted","Data":"9d0e0b8ec2232c35c2e285b9d3d9632b4433686e0350a45bf9c24a99d9a812dd"} Apr 16 22:15:21.847642 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:21.847582 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8m6vz" podStartSLOduration=1.847562288 podStartE2EDuration="1.847562288s" podCreationTimestamp="2026-04-16 22:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:15:21.845243413 +0000 UTC m=+102.070409792" watchObservedRunningTime="2026-04-16 22:15:21.847562288 +0000 UTC m=+102.072728671" Apr 16 22:15:22.373601 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:22.372931 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bjs2g\" (UID: \"ee7387b2-e00d-45fd-881b-86e9cd70ab5b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g" Apr 16 22:15:22.373601 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:22.373173 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:22.373601 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:22.373236 2572 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls podName:ee7387b2-e00d-45fd-881b-86e9cd70ab5b nodeName:}" failed. No retries permitted until 2026-04-16 22:15:24.373217172 +0000 UTC m=+104.598383535 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bjs2g" (UID: "ee7387b2-e00d-45fd-881b-86e9cd70ab5b") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:22.474127 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:22.474075 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" Apr 16 22:15:22.474315 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:22.474155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" Apr 16 22:15:22.474388 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:22.474325 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle podName:5e1bbae3-73bb-42df-9209-225120eedbb0 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:24.474304949 +0000 UTC m=+104.699471312 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle") pod "router-default-5c7c6f9bb8-k7dmc" (UID: "5e1bbae3-73bb-42df-9209-225120eedbb0") : configmap references non-existent config key: service-ca.crt Apr 16 22:15:22.474757 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:22.474733 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:15:22.474850 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:22.474783 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs podName:5e1bbae3-73bb-42df-9209-225120eedbb0 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:24.474767561 +0000 UTC m=+104.699933925 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs") pod "router-default-5c7c6f9bb8-k7dmc" (UID: "5e1bbae3-73bb-42df-9209-225120eedbb0") : secret "router-metrics-certs-default" not found Apr 16 22:15:22.575204 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:22.575164 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d68fg\" (UID: \"1c648959-2d38-4121-8bc5-57108f33c786\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg" Apr 16 22:15:22.575444 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:22.575426 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:15:22.575513 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:22.575501 2572 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls podName:1c648959-2d38-4121-8bc5-57108f33c786 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:24.57548104 +0000 UTC m=+104.800647423 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d68fg" (UID: "1c648959-2d38-4121-8bc5-57108f33c786") : secret "samples-operator-tls" not found Apr 16 22:15:22.654869 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:22.653679 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-r575f"] Apr 16 22:15:22.656249 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:22.656219 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f" Apr 16 22:15:22.659272 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:22.659239 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 22:15:22.659407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:22.659353 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-wmljm\"" Apr 16 22:15:22.659658 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:22.659635 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 22:15:22.666195 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:22.666172 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-r575f"] Apr 16 22:15:22.777988 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:22.777943 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/312dcba8-3e17-48d7-9fb2-06758dcd295e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-r575f\" (UID: \"312dcba8-3e17-48d7-9fb2-06758dcd295e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f" Apr 16 22:15:22.778210 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:22.778159 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-r575f\" (UID: \"312dcba8-3e17-48d7-9fb2-06758dcd295e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f" Apr 16 22:15:22.879554 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:22.879380 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 22:15:22.879554 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:22.879497 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert podName:312dcba8-3e17-48d7-9fb2-06758dcd295e nodeName:}" failed. No retries permitted until 2026-04-16 22:15:23.379473043 +0000 UTC m=+103.604639405 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-r575f" (UID: "312dcba8-3e17-48d7-9fb2-06758dcd295e") : secret "networking-console-plugin-cert" not found Apr 16 22:15:22.880223 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:22.879180 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-r575f\" (UID: \"312dcba8-3e17-48d7-9fb2-06758dcd295e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f" Apr 16 22:15:22.880223 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:22.879966 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/312dcba8-3e17-48d7-9fb2-06758dcd295e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-r575f\" (UID: \"312dcba8-3e17-48d7-9fb2-06758dcd295e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f" Apr 16 22:15:22.881634 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:22.880969 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/312dcba8-3e17-48d7-9fb2-06758dcd295e-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-r575f\" (UID: \"312dcba8-3e17-48d7-9fb2-06758dcd295e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f" Apr 16 22:15:23.385182 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:23.385139 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-r575f\" (UID: \"312dcba8-3e17-48d7-9fb2-06758dcd295e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f" Apr 16 22:15:23.385361 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:23.385293 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 22:15:23.385414 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:23.385367 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert podName:312dcba8-3e17-48d7-9fb2-06758dcd295e nodeName:}" failed. No retries permitted until 2026-04-16 22:15:24.385349601 +0000 UTC m=+104.610516178 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-r575f" (UID: "312dcba8-3e17-48d7-9fb2-06758dcd295e") : secret "networking-console-plugin-cert" not found Apr 16 22:15:23.841238 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:23.841196 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ghzq7" event={"ID":"8c11f0d2-8369-4d94-8465-bfa195f8781d","Type":"ContainerStarted","Data":"df5b4827934456e693c5b98899cb049f3aa1e73a0093bd02be95e3ad6bf77a6a"} Apr 16 22:15:23.856462 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:23.856315 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ghzq7" podStartSLOduration=2.292967492 podStartE2EDuration="3.856292688s" podCreationTimestamp="2026-04-16 22:15:20 +0000 UTC" firstStartedPulling="2026-04-16 22:15:21.290697466 +0000 UTC m=+101.515863830" lastFinishedPulling="2026-04-16 
22:15:22.854022665 +0000 UTC m=+103.079189026" observedRunningTime="2026-04-16 22:15:23.855797603 +0000 UTC m=+104.080963987" watchObservedRunningTime="2026-04-16 22:15:23.856292688 +0000 UTC m=+104.081459069" Apr 16 22:15:24.395184 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:24.395145 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bjs2g\" (UID: \"ee7387b2-e00d-45fd-881b-86e9cd70ab5b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g" Apr 16 22:15:24.395184 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:24.395188 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-r575f\" (UID: \"312dcba8-3e17-48d7-9fb2-06758dcd295e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f" Apr 16 22:15:24.395708 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:24.395337 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:24.395708 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:24.395410 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls podName:ee7387b2-e00d-45fd-881b-86e9cd70ab5b nodeName:}" failed. No retries permitted until 2026-04-16 22:15:28.395391061 +0000 UTC m=+108.620557440 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bjs2g" (UID: "ee7387b2-e00d-45fd-881b-86e9cd70ab5b") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:24.395708 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:24.395408 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 22:15:24.395708 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:24.395441 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert podName:312dcba8-3e17-48d7-9fb2-06758dcd295e nodeName:}" failed. No retries permitted until 2026-04-16 22:15:26.395432269 +0000 UTC m=+106.620598626 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-r575f" (UID: "312dcba8-3e17-48d7-9fb2-06758dcd295e") : secret "networking-console-plugin-cert" not found Apr 16 22:15:24.495936 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:24.495897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" Apr 16 22:15:24.495936 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:24.495944 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle\") pod 
\"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" Apr 16 22:15:24.496179 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:24.496077 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:15:24.496179 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:24.496095 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle podName:5e1bbae3-73bb-42df-9209-225120eedbb0 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:28.496078441 +0000 UTC m=+108.721244817 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle") pod "router-default-5c7c6f9bb8-k7dmc" (UID: "5e1bbae3-73bb-42df-9209-225120eedbb0") : configmap references non-existent config key: service-ca.crt Apr 16 22:15:24.496179 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:24.496172 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs podName:5e1bbae3-73bb-42df-9209-225120eedbb0 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:28.496152272 +0000 UTC m=+108.721318636 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs") pod "router-default-5c7c6f9bb8-k7dmc" (UID: "5e1bbae3-73bb-42df-9209-225120eedbb0") : secret "router-metrics-certs-default" not found Apr 16 22:15:24.596785 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:24.596747 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d68fg\" (UID: \"1c648959-2d38-4121-8bc5-57108f33c786\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg" Apr 16 22:15:24.596965 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:24.596911 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:15:24.597041 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:24.596994 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls podName:1c648959-2d38-4121-8bc5-57108f33c786 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:28.596972946 +0000 UTC m=+108.822139310 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d68fg" (UID: "1c648959-2d38-4121-8bc5-57108f33c786") : secret "samples-operator-tls" not found Apr 16 22:15:24.846647 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:24.846606 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq" event={"ID":"477be877-0af9-4fc8-85b1-24658b74a7a8","Type":"ContainerStarted","Data":"df27a52ca9ddc6093bed8ec13a7e887ae0162e4d284c70b686852f1f9a0d3ffd"} Apr 16 22:15:24.848093 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:24.848050 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv" event={"ID":"64846700-7b7f-4273-b209-6b6a814c538e","Type":"ContainerStarted","Data":"e043bfcfb3eaa6ffa1222dd592423bd46210bbc5118924bde42ed66f93a1a2b8"} Apr 16 22:15:24.849538 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:24.849505 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2qjf4" event={"ID":"8642ce80-b932-4245-8f5a-3ce2e6014659","Type":"ContainerStarted","Data":"e3a4f135d67c55fc33f427b4d0172d2601654fc193db30b62efed7a278108896"} Apr 16 22:15:24.863603 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:24.863555 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq" podStartSLOduration=1.556758901 podStartE2EDuration="4.863541029s" podCreationTimestamp="2026-04-16 22:15:20 +0000 UTC" firstStartedPulling="2026-04-16 22:15:21.390834157 +0000 UTC m=+101.616000514" lastFinishedPulling="2026-04-16 22:15:24.697616266 +0000 UTC m=+104.922782642" observedRunningTime="2026-04-16 22:15:24.862986265 
+0000 UTC m=+105.088152647" watchObservedRunningTime="2026-04-16 22:15:24.863541029 +0000 UTC m=+105.088707403" Apr 16 22:15:24.885405 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:24.885350 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv" podStartSLOduration=1.564590318 podStartE2EDuration="4.88533384s" podCreationTimestamp="2026-04-16 22:15:20 +0000 UTC" firstStartedPulling="2026-04-16 22:15:21.373281903 +0000 UTC m=+101.598448260" lastFinishedPulling="2026-04-16 22:15:24.694025411 +0000 UTC m=+104.919191782" observedRunningTime="2026-04-16 22:15:24.884679495 +0000 UTC m=+105.109845876" watchObservedRunningTime="2026-04-16 22:15:24.88533384 +0000 UTC m=+105.110500219" Apr 16 22:15:24.902479 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:24.902419 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-2qjf4" podStartSLOduration=1.7064587100000002 podStartE2EDuration="4.902400791s" podCreationTimestamp="2026-04-16 22:15:20 +0000 UTC" firstStartedPulling="2026-04-16 22:15:21.502870523 +0000 UTC m=+101.728036881" lastFinishedPulling="2026-04-16 22:15:24.698812601 +0000 UTC m=+104.923978962" observedRunningTime="2026-04-16 22:15:24.901156579 +0000 UTC m=+105.126322960" watchObservedRunningTime="2026-04-16 22:15:24.902400791 +0000 UTC m=+105.127567170" Apr 16 22:15:26.047519 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:26.047479 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-mgtzp"] Apr 16 22:15:26.050603 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:26.050576 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mgtzp" Apr 16 22:15:26.053349 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:26.053325 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:26.053495 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:26.053358 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-q25st\"" Apr 16 22:15:26.054378 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:26.054356 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 22:15:26.059765 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:26.059738 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-mgtzp"] Apr 16 22:15:26.111561 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:26.111524 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4rsq\" (UniqueName: \"kubernetes.io/projected/13cc71b0-ff0e-4ee6-9a87-fef5891bc28a-kube-api-access-r4rsq\") pod \"migrator-74bb7799d9-mgtzp\" (UID: \"13cc71b0-ff0e-4ee6-9a87-fef5891bc28a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mgtzp" Apr 16 22:15:26.211967 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:26.211918 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4rsq\" (UniqueName: \"kubernetes.io/projected/13cc71b0-ff0e-4ee6-9a87-fef5891bc28a-kube-api-access-r4rsq\") pod \"migrator-74bb7799d9-mgtzp\" (UID: \"13cc71b0-ff0e-4ee6-9a87-fef5891bc28a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mgtzp" Apr 16 22:15:26.219977 ip-10-0-133-16 kubenswrapper[2572]: 
I0416 22:15:26.219951 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4rsq\" (UniqueName: \"kubernetes.io/projected/13cc71b0-ff0e-4ee6-9a87-fef5891bc28a-kube-api-access-r4rsq\") pod \"migrator-74bb7799d9-mgtzp\" (UID: \"13cc71b0-ff0e-4ee6-9a87-fef5891bc28a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mgtzp" Apr 16 22:15:26.362346 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:26.362247 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mgtzp" Apr 16 22:15:26.413751 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:26.413700 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-r575f\" (UID: \"312dcba8-3e17-48d7-9fb2-06758dcd295e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f" Apr 16 22:15:26.413901 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:26.413874 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 22:15:26.413974 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:26.413945 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert podName:312dcba8-3e17-48d7-9fb2-06758dcd295e nodeName:}" failed. No retries permitted until 2026-04-16 22:15:30.413923084 +0000 UTC m=+110.639089445 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-r575f" (UID: "312dcba8-3e17-48d7-9fb2-06758dcd295e") : secret "networking-console-plugin-cert" not found Apr 16 22:15:26.483474 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:26.483440 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-mgtzp"] Apr 16 22:15:26.486693 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:15:26.486659 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13cc71b0_ff0e_4ee6_9a87_fef5891bc28a.slice/crio-7b58fdc15b22bb2f4cbf8d1a9f85b295f094e53802acc41cbeef6df39a5f7429 WatchSource:0}: Error finding container 7b58fdc15b22bb2f4cbf8d1a9f85b295f094e53802acc41cbeef6df39a5f7429: Status 404 returned error can't find the container with id 7b58fdc15b22bb2f4cbf8d1a9f85b295f094e53802acc41cbeef6df39a5f7429 Apr 16 22:15:26.856630 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:26.856593 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mgtzp" event={"ID":"13cc71b0-ff0e-4ee6-9a87-fef5891bc28a","Type":"ContainerStarted","Data":"7b58fdc15b22bb2f4cbf8d1a9f85b295f094e53802acc41cbeef6df39a5f7429"} Apr 16 22:15:27.860414 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:27.860383 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mgtzp" event={"ID":"13cc71b0-ff0e-4ee6-9a87-fef5891bc28a","Type":"ContainerStarted","Data":"51109ad255ba3efd6abd4f22250f3e80470bb16a451fd08ccf1972a82d0f4596"} Apr 16 22:15:27.860414 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:27.860422 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mgtzp" event={"ID":"13cc71b0-ff0e-4ee6-9a87-fef5891bc28a","Type":"ContainerStarted","Data":"60af46dcb629caa4bd7a9315c665a9c5504f09f3985ca46dfa7a475b37949292"} Apr 16 22:15:27.881120 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:27.881054 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mgtzp" podStartSLOduration=0.636643941 podStartE2EDuration="1.881035667s" podCreationTimestamp="2026-04-16 22:15:26 +0000 UTC" firstStartedPulling="2026-04-16 22:15:26.488680818 +0000 UTC m=+106.713847190" lastFinishedPulling="2026-04-16 22:15:27.733072543 +0000 UTC m=+107.958238916" observedRunningTime="2026-04-16 22:15:27.879962894 +0000 UTC m=+108.105129273" watchObservedRunningTime="2026-04-16 22:15:27.881035667 +0000 UTC m=+108.106202048" Apr 16 22:15:28.098642 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:28.098613 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wr7c8_07fc2833-ed6f-40da-ac15-b1ce9aa369c6/dns-node-resolver/0.log" Apr 16 22:15:28.432144 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:28.432023 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bjs2g\" (UID: \"ee7387b2-e00d-45fd-881b-86e9cd70ab5b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g" Apr 16 22:15:28.432279 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:28.432221 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:28.432325 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:28.432287 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls podName:ee7387b2-e00d-45fd-881b-86e9cd70ab5b nodeName:}" failed. No retries permitted until 2026-04-16 22:15:36.432269669 +0000 UTC m=+116.657436030 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bjs2g" (UID: "ee7387b2-e00d-45fd-881b-86e9cd70ab5b") : secret "cluster-monitoring-operator-tls" not found Apr 16 22:15:28.532951 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:28.532909 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" Apr 16 22:15:28.533141 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:28.532959 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" Apr 16 22:15:28.533141 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:28.533063 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:15:28.533141 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:28.533097 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle podName:5e1bbae3-73bb-42df-9209-225120eedbb0 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:15:36.533083585 +0000 UTC m=+116.758249942 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle") pod "router-default-5c7c6f9bb8-k7dmc" (UID: "5e1bbae3-73bb-42df-9209-225120eedbb0") : configmap references non-existent config key: service-ca.crt Apr 16 22:15:28.533261 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:28.533144 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs podName:5e1bbae3-73bb-42df-9209-225120eedbb0 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:36.533134981 +0000 UTC m=+116.758301340 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs") pod "router-default-5c7c6f9bb8-k7dmc" (UID: "5e1bbae3-73bb-42df-9209-225120eedbb0") : secret "router-metrics-certs-default" not found Apr 16 22:15:28.633667 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:28.633622 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d68fg\" (UID: \"1c648959-2d38-4121-8bc5-57108f33c786\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg" Apr 16 22:15:28.633801 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:28.633781 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 22:15:28.633869 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:28.633858 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls 
podName:1c648959-2d38-4121-8bc5-57108f33c786 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:36.633839256 +0000 UTC m=+116.859005615 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d68fg" (UID: "1c648959-2d38-4121-8bc5-57108f33c786") : secret "samples-operator-tls" not found Apr 16 22:15:28.898388 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:28.898361 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7969m_57bb8201-a601-40d9-9277-43dfb8dd33cd/node-ca/0.log" Apr 16 22:15:28.987422 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:28.987384 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-66dqz"] Apr 16 22:15:28.989459 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:28.989443 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-66dqz" Apr 16 22:15:28.992205 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:28.992179 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 22:15:28.993258 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:28.993240 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-k8kmv\"" Apr 16 22:15:28.993395 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:28.993276 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 22:15:28.993395 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:28.993299 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 22:15:28.993395 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:28.993309 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 22:15:28.998783 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:28.998761 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-66dqz"] Apr 16 22:15:29.139973 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:29.139938 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crxzc\" (UniqueName: \"kubernetes.io/projected/1b34b8c5-091c-469e-b0ff-eaa822f18c93-kube-api-access-crxzc\") pod \"service-ca-865cb79987-66dqz\" (UID: \"1b34b8c5-091c-469e-b0ff-eaa822f18c93\") " pod="openshift-service-ca/service-ca-865cb79987-66dqz" Apr 16 22:15:29.140173 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:29.139999 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/1b34b8c5-091c-469e-b0ff-eaa822f18c93-signing-key\") pod \"service-ca-865cb79987-66dqz\" (UID: \"1b34b8c5-091c-469e-b0ff-eaa822f18c93\") " pod="openshift-service-ca/service-ca-865cb79987-66dqz" Apr 16 22:15:29.140173 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:29.140100 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1b34b8c5-091c-469e-b0ff-eaa822f18c93-signing-cabundle\") pod \"service-ca-865cb79987-66dqz\" (UID: \"1b34b8c5-091c-469e-b0ff-eaa822f18c93\") " pod="openshift-service-ca/service-ca-865cb79987-66dqz" Apr 16 22:15:29.241515 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:29.241423 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crxzc\" (UniqueName: \"kubernetes.io/projected/1b34b8c5-091c-469e-b0ff-eaa822f18c93-kube-api-access-crxzc\") pod \"service-ca-865cb79987-66dqz\" (UID: \"1b34b8c5-091c-469e-b0ff-eaa822f18c93\") " pod="openshift-service-ca/service-ca-865cb79987-66dqz" Apr 16 22:15:29.241515 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:29.241497 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1b34b8c5-091c-469e-b0ff-eaa822f18c93-signing-key\") pod \"service-ca-865cb79987-66dqz\" (UID: \"1b34b8c5-091c-469e-b0ff-eaa822f18c93\") " pod="openshift-service-ca/service-ca-865cb79987-66dqz" Apr 16 22:15:29.241704 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:29.241555 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1b34b8c5-091c-469e-b0ff-eaa822f18c93-signing-cabundle\") pod \"service-ca-865cb79987-66dqz\" (UID: \"1b34b8c5-091c-469e-b0ff-eaa822f18c93\") " pod="openshift-service-ca/service-ca-865cb79987-66dqz" Apr 16 22:15:29.242238 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:29.242220 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1b34b8c5-091c-469e-b0ff-eaa822f18c93-signing-cabundle\") pod \"service-ca-865cb79987-66dqz\" (UID: \"1b34b8c5-091c-469e-b0ff-eaa822f18c93\") " pod="openshift-service-ca/service-ca-865cb79987-66dqz" Apr 16 22:15:29.244029 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:29.244003 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1b34b8c5-091c-469e-b0ff-eaa822f18c93-signing-key\") pod \"service-ca-865cb79987-66dqz\" (UID: \"1b34b8c5-091c-469e-b0ff-eaa822f18c93\") " pod="openshift-service-ca/service-ca-865cb79987-66dqz" Apr 16 22:15:29.249597 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:29.249575 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crxzc\" (UniqueName: \"kubernetes.io/projected/1b34b8c5-091c-469e-b0ff-eaa822f18c93-kube-api-access-crxzc\") pod \"service-ca-865cb79987-66dqz\" (UID: \"1b34b8c5-091c-469e-b0ff-eaa822f18c93\") " pod="openshift-service-ca/service-ca-865cb79987-66dqz" Apr 16 22:15:29.298032 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:29.298001 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-66dqz" Apr 16 22:15:29.414852 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:29.414821 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-66dqz"] Apr 16 22:15:29.417694 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:15:29.417663 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b34b8c5_091c_469e_b0ff_eaa822f18c93.slice/crio-7094cd8f61139560c8cbd191703130360da75beac906c54357a0c770c943233a WatchSource:0}: Error finding container 7094cd8f61139560c8cbd191703130360da75beac906c54357a0c770c943233a: Status 404 returned error can't find the container with id 7094cd8f61139560c8cbd191703130360da75beac906c54357a0c770c943233a Apr 16 22:15:29.867154 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:29.867122 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-66dqz" event={"ID":"1b34b8c5-091c-469e-b0ff-eaa822f18c93","Type":"ContainerStarted","Data":"76ab288c3e36a50092e081c67b633757bc8bf5022de7976d0fd91bbd5469cb3e"} Apr 16 22:15:29.867154 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:29.867160 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-66dqz" event={"ID":"1b34b8c5-091c-469e-b0ff-eaa822f18c93","Type":"ContainerStarted","Data":"7094cd8f61139560c8cbd191703130360da75beac906c54357a0c770c943233a"} Apr 16 22:15:29.887280 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:29.887222 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-66dqz" podStartSLOduration=1.8872013669999999 podStartE2EDuration="1.887201367s" podCreationTimestamp="2026-04-16 22:15:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
22:15:29.886090597 +0000 UTC m=+110.111256984" watchObservedRunningTime="2026-04-16 22:15:29.887201367 +0000 UTC m=+110.112367746"
Apr 16 22:15:30.452612 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:30.452567 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-r575f\" (UID: \"312dcba8-3e17-48d7-9fb2-06758dcd295e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f"
Apr 16 22:15:30.453182 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:30.452764 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:15:30.453182 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:30.452865 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert podName:312dcba8-3e17-48d7-9fb2-06758dcd295e nodeName:}" failed. No retries permitted until 2026-04-16 22:15:38.452841011 +0000 UTC m=+118.678007417 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-r575f" (UID: "312dcba8-3e17-48d7-9fb2-06758dcd295e") : secret "networking-console-plugin-cert" not found
Apr 16 22:15:36.506067 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:36.506024 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bjs2g\" (UID: \"ee7387b2-e00d-45fd-881b-86e9cd70ab5b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g"
Apr 16 22:15:36.506493 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:36.506201 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 22:15:36.506493 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:36.506267 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls podName:ee7387b2-e00d-45fd-881b-86e9cd70ab5b nodeName:}" failed. No retries permitted until 2026-04-16 22:15:52.506250333 +0000 UTC m=+132.731416712 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bjs2g" (UID: "ee7387b2-e00d-45fd-881b-86e9cd70ab5b") : secret "cluster-monitoring-operator-tls" not found
Apr 16 22:15:36.607150 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:36.607085 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:36.607150 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:36.607159 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:36.607356 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:36.607331 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle podName:5e1bbae3-73bb-42df-9209-225120eedbb0 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:52.607310309 +0000 UTC m=+132.832476681 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle") pod "router-default-5c7c6f9bb8-k7dmc" (UID: "5e1bbae3-73bb-42df-9209-225120eedbb0") : configmap references non-existent config key: service-ca.crt
Apr 16 22:15:36.609676 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:36.609649 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5e1bbae3-73bb-42df-9209-225120eedbb0-metrics-certs\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:36.708129 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:36.708065 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d68fg\" (UID: \"1c648959-2d38-4121-8bc5-57108f33c786\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg"
Apr 16 22:15:36.710495 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:36.710470 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1c648959-2d38-4121-8bc5-57108f33c786-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d68fg\" (UID: \"1c648959-2d38-4121-8bc5-57108f33c786\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg"
Apr 16 22:15:36.755867 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:36.755828 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg"
Apr 16 22:15:36.887738 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:36.887704 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg"]
Apr 16 22:15:37.889653 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:37.889616 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg" event={"ID":"1c648959-2d38-4121-8bc5-57108f33c786","Type":"ContainerStarted","Data":"9c1859fd7d3a790748b1042925d199db243638b871ab669ff4e882e151e7f55a"}
Apr 16 22:15:38.525229 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:38.525190 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-r575f\" (UID: \"312dcba8-3e17-48d7-9fb2-06758dcd295e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f"
Apr 16 22:15:38.525407 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:38.525372 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:15:38.525493 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:15:38.525483 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert podName:312dcba8-3e17-48d7-9fb2-06758dcd295e nodeName:}" failed. No retries permitted until 2026-04-16 22:15:54.525457453 +0000 UTC m=+134.750623832 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-r575f" (UID: "312dcba8-3e17-48d7-9fb2-06758dcd295e") : secret "networking-console-plugin-cert" not found
Apr 16 22:15:39.895836 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:39.895795 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg" event={"ID":"1c648959-2d38-4121-8bc5-57108f33c786","Type":"ContainerStarted","Data":"01d6a7a09d88a505acd2a69a70df01234bf5b02c3bf41beb59d179a1e3da1fc2"}
Apr 16 22:15:39.895836 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:39.895834 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg" event={"ID":"1c648959-2d38-4121-8bc5-57108f33c786","Type":"ContainerStarted","Data":"17bf57934e9106ca999790508e3382e5502d44d37269ad524d46c25090eaac9b"}
Apr 16 22:15:39.913589 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:39.913540 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d68fg" podStartSLOduration=17.917706241 podStartE2EDuration="19.913527713s" podCreationTimestamp="2026-04-16 22:15:20 +0000 UTC" firstStartedPulling="2026-04-16 22:15:36.92617272 +0000 UTC m=+117.151339080" lastFinishedPulling="2026-04-16 22:15:38.921994194 +0000 UTC m=+119.147160552" observedRunningTime="2026-04-16 22:15:39.911734622 +0000 UTC m=+120.136901002" watchObservedRunningTime="2026-04-16 22:15:39.913527713 +0000 UTC m=+120.138694092"
Apr 16 22:15:49.122699 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:49.122655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs\") pod \"network-metrics-daemon-bvqqh\" (UID: \"bb0e10b7-e200-4438-8847-608a1a6cace3\") " pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:15:49.125213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:49.125185 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bb0e10b7-e200-4438-8847-608a1a6cace3-metrics-certs\") pod \"network-metrics-daemon-bvqqh\" (UID: \"bb0e10b7-e200-4438-8847-608a1a6cace3\") " pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:15:49.159244 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:49.159209 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hqwt5\""
Apr 16 22:15:49.167549 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:49.167517 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bvqqh"
Apr 16 22:15:49.290803 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:49.290634 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bvqqh"]
Apr 16 22:15:49.293280 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:15:49.293254 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb0e10b7_e200_4438_8847_608a1a6cace3.slice/crio-0d0d6458e2b473b21056abe0fed9113a4539c04a711cca0d45724f2863b02abb WatchSource:0}: Error finding container 0d0d6458e2b473b21056abe0fed9113a4539c04a711cca0d45724f2863b02abb: Status 404 returned error can't find the container with id 0d0d6458e2b473b21056abe0fed9113a4539c04a711cca0d45724f2863b02abb
Apr 16 22:15:49.926241 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:49.926196 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bvqqh" event={"ID":"bb0e10b7-e200-4438-8847-608a1a6cace3","Type":"ContainerStarted","Data":"0d0d6458e2b473b21056abe0fed9113a4539c04a711cca0d45724f2863b02abb"}
Apr 16 22:15:50.930804 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:50.930767 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bvqqh" event={"ID":"bb0e10b7-e200-4438-8847-608a1a6cace3","Type":"ContainerStarted","Data":"e681d60336a62825714a12b90da0f1a0fe6104cd6ce572bc62c4f94a9f7f9210"}
Apr 16 22:15:50.930804 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:50.930807 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bvqqh" event={"ID":"bb0e10b7-e200-4438-8847-608a1a6cace3","Type":"ContainerStarted","Data":"a3ecf752c98dabd3d212fbf0db770492cfb66a83216184e8bcf6a862f691b041"}
Apr 16 22:15:50.947434 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:50.947381 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bvqqh" podStartSLOduration=129.782224067 podStartE2EDuration="2m10.947364281s" podCreationTimestamp="2026-04-16 22:13:40 +0000 UTC" firstStartedPulling="2026-04-16 22:15:49.295588035 +0000 UTC m=+129.520754396" lastFinishedPulling="2026-04-16 22:15:50.460728252 +0000 UTC m=+130.685894610" observedRunningTime="2026-04-16 22:15:50.946319275 +0000 UTC m=+131.171485654" watchObservedRunningTime="2026-04-16 22:15:50.947364281 +0000 UTC m=+131.172530661"
Apr 16 22:15:51.876969 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.876932 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-th4nh"]
Apr 16 22:15:51.880294 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.880271 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:51.882808 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.882784 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 22:15:51.883066 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.883052 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 22:15:51.884207 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.884189 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bblwd\""
Apr 16 22:15:51.897832 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.897799 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-th4nh"]
Apr 16 22:15:51.949221 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.949180 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/85c31b35-d034-451f-b044-035fb08e99c1-crio-socket\") pod \"insights-runtime-extractor-th4nh\" (UID: \"85c31b35-d034-451f-b044-035fb08e99c1\") " pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:51.949221 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.949221 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/85c31b35-d034-451f-b044-035fb08e99c1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-th4nh\" (UID: \"85c31b35-d034-451f-b044-035fb08e99c1\") " pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:51.949700 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.949247 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/85c31b35-d034-451f-b044-035fb08e99c1-data-volume\") pod \"insights-runtime-extractor-th4nh\" (UID: \"85c31b35-d034-451f-b044-035fb08e99c1\") " pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:51.949700 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.949337 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmmmw\" (UniqueName: \"kubernetes.io/projected/85c31b35-d034-451f-b044-035fb08e99c1-kube-api-access-kmmmw\") pod \"insights-runtime-extractor-th4nh\" (UID: \"85c31b35-d034-451f-b044-035fb08e99c1\") " pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:51.949700 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.949370 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/85c31b35-d034-451f-b044-035fb08e99c1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-th4nh\" (UID: \"85c31b35-d034-451f-b044-035fb08e99c1\") " pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:51.980797 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.980753 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-69fb54c674-n5bgr"]
Apr 16 22:15:51.984229 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.984204 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:51.987293 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.987271 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-s46c2\""
Apr 16 22:15:51.987804 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.987784 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 22:15:51.987941 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.987908 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 22:15:51.988280 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.988262 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 22:15:51.993046 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.993026 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 22:15:51.997274 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:51.997248 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69fb54c674-n5bgr"]
Apr 16 22:15:52.050294 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.050256 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/85c31b35-d034-451f-b044-035fb08e99c1-crio-socket\") pod \"insights-runtime-extractor-th4nh\" (UID: \"85c31b35-d034-451f-b044-035fb08e99c1\") " pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:52.050294 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.050292 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/85c31b35-d034-451f-b044-035fb08e99c1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-th4nh\" (UID: \"85c31b35-d034-451f-b044-035fb08e99c1\") " pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:52.050562 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.050318 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5546cbc6-033e-4879-90dc-24493ae81f46-registry-tls\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.050562 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.050338 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/85c31b35-d034-451f-b044-035fb08e99c1-data-volume\") pod \"insights-runtime-extractor-th4nh\" (UID: \"85c31b35-d034-451f-b044-035fb08e99c1\") " pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:52.050562 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.050357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5546cbc6-033e-4879-90dc-24493ae81f46-registry-certificates\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.050562 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.050371 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/85c31b35-d034-451f-b044-035fb08e99c1-crio-socket\") pod \"insights-runtime-extractor-th4nh\" (UID: \"85c31b35-d034-451f-b044-035fb08e99c1\") " pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:52.050562 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.050376 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5546cbc6-033e-4879-90dc-24493ae81f46-bound-sa-token\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.050562 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.050463 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5546cbc6-033e-4879-90dc-24493ae81f46-ca-trust-extracted\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.050562 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.050502 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmmmw\" (UniqueName: \"kubernetes.io/projected/85c31b35-d034-451f-b044-035fb08e99c1-kube-api-access-kmmmw\") pod \"insights-runtime-extractor-th4nh\" (UID: \"85c31b35-d034-451f-b044-035fb08e99c1\") " pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:52.050562 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.050524 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8njzn\" (UniqueName: \"kubernetes.io/projected/5546cbc6-033e-4879-90dc-24493ae81f46-kube-api-access-8njzn\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.050562 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.050555 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/85c31b35-d034-451f-b044-035fb08e99c1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-th4nh\" (UID: \"85c31b35-d034-451f-b044-035fb08e99c1\") " pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:52.050899 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.050606 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5546cbc6-033e-4879-90dc-24493ae81f46-image-registry-private-configuration\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.050899 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.050633 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5546cbc6-033e-4879-90dc-24493ae81f46-trusted-ca\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.050899 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.050687 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5546cbc6-033e-4879-90dc-24493ae81f46-installation-pull-secrets\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.050899 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.050757 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/85c31b35-d034-451f-b044-035fb08e99c1-data-volume\") pod \"insights-runtime-extractor-th4nh\" (UID: \"85c31b35-d034-451f-b044-035fb08e99c1\") " pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:52.051093 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.051077 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/85c31b35-d034-451f-b044-035fb08e99c1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-th4nh\" (UID: \"85c31b35-d034-451f-b044-035fb08e99c1\") " pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:52.052678 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.052649 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/85c31b35-d034-451f-b044-035fb08e99c1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-th4nh\" (UID: \"85c31b35-d034-451f-b044-035fb08e99c1\") " pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:52.062824 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.062793 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmmmw\" (UniqueName: \"kubernetes.io/projected/85c31b35-d034-451f-b044-035fb08e99c1-kube-api-access-kmmmw\") pod \"insights-runtime-extractor-th4nh\" (UID: \"85c31b35-d034-451f-b044-035fb08e99c1\") " pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:52.151884 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.151788 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8njzn\" (UniqueName: \"kubernetes.io/projected/5546cbc6-033e-4879-90dc-24493ae81f46-kube-api-access-8njzn\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.151884 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.151855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5546cbc6-033e-4879-90dc-24493ae81f46-image-registry-private-configuration\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.151884 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.151876 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5546cbc6-033e-4879-90dc-24493ae81f46-trusted-ca\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.152227 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.151904 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5546cbc6-033e-4879-90dc-24493ae81f46-installation-pull-secrets\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.152227 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.152156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5546cbc6-033e-4879-90dc-24493ae81f46-registry-tls\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.152227 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.152196 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5546cbc6-033e-4879-90dc-24493ae81f46-registry-certificates\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.152227 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.152212 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5546cbc6-033e-4879-90dc-24493ae81f46-bound-sa-token\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.152424 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.152258 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5546cbc6-033e-4879-90dc-24493ae81f46-ca-trust-extracted\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.152889 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.152845 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5546cbc6-033e-4879-90dc-24493ae81f46-ca-trust-extracted\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.153068 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.153047 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5546cbc6-033e-4879-90dc-24493ae81f46-registry-certificates\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.153175 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.153088 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5546cbc6-033e-4879-90dc-24493ae81f46-trusted-ca\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.154734 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.154710 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5546cbc6-033e-4879-90dc-24493ae81f46-registry-tls\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.154816 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.154719 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5546cbc6-033e-4879-90dc-24493ae81f46-installation-pull-secrets\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.154816 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.154752 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5546cbc6-033e-4879-90dc-24493ae81f46-image-registry-private-configuration\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.160614 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.160587 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5546cbc6-033e-4879-90dc-24493ae81f46-bound-sa-token\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.160819 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.160794 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8njzn\" (UniqueName: \"kubernetes.io/projected/5546cbc6-033e-4879-90dc-24493ae81f46-kube-api-access-8njzn\") pod \"image-registry-69fb54c674-n5bgr\" (UID: \"5546cbc6-033e-4879-90dc-24493ae81f46\") " pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.189791 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.189751 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-th4nh"
Apr 16 22:15:52.293730 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.293700 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:15:52.337638 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.337579 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-th4nh"]
Apr 16 22:15:52.342739 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:15:52.342695 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85c31b35_d034_451f_b044_035fb08e99c1.slice/crio-11f90b0e336a87b5877addec31149928d5b336dec44d068b77dae7916e597959 WatchSource:0}: Error finding container 11f90b0e336a87b5877addec31149928d5b336dec44d068b77dae7916e597959: Status 404 returned error can't find the container with id 11f90b0e336a87b5877addec31149928d5b336dec44d068b77dae7916e597959
Apr 16 22:15:52.429655 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.429626 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69fb54c674-n5bgr"]
Apr 16 22:15:52.432398 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:15:52.432366 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5546cbc6_033e_4879_90dc_24493ae81f46.slice/crio-8d7ce31027d50bf88e95eed400c78a260a29556f6514f49204bc820fa56dd098 WatchSource:0}: Error finding container 8d7ce31027d50bf88e95eed400c78a260a29556f6514f49204bc820fa56dd098: Status 404 returned error can't find the container with id 8d7ce31027d50bf88e95eed400c78a260a29556f6514f49204bc820fa56dd098
Apr 16 22:15:52.557214 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.557173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bjs2g\" (UID: \"ee7387b2-e00d-45fd-881b-86e9cd70ab5b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g"
Apr 16 22:15:52.559564 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.559535 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee7387b2-e00d-45fd-881b-86e9cd70ab5b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bjs2g\" (UID: \"ee7387b2-e00d-45fd-881b-86e9cd70ab5b\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g"
Apr 16 22:15:52.657993 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.657948 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:52.658608 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.658587 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e1bbae3-73bb-42df-9209-225120eedbb0-service-ca-bundle\") pod \"router-default-5c7c6f9bb8-k7dmc\" (UID: \"5e1bbae3-73bb-42df-9209-225120eedbb0\") " pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc"
Apr 16 22:15:52.733360 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.733276 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-lmn2g\""
Apr 16 22:15:52.740581 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.740554 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g"
Apr 16 22:15:52.836271 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.836235 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-xmhbl\""
Apr 16 22:15:52.845003 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.844954 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" Apr 16 22:15:52.873098 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.872971 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g"] Apr 16 22:15:52.875662 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:15:52.875630 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee7387b2_e00d_45fd_881b_86e9cd70ab5b.slice/crio-9699c4da00ced5490a2c2a7d68eb65790df37448da3f0d0cd54608f726101ff2 WatchSource:0}: Error finding container 9699c4da00ced5490a2c2a7d68eb65790df37448da3f0d0cd54608f726101ff2: Status 404 returned error can't find the container with id 9699c4da00ced5490a2c2a7d68eb65790df37448da3f0d0cd54608f726101ff2 Apr 16 22:15:52.940702 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.940631 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g" event={"ID":"ee7387b2-e00d-45fd-881b-86e9cd70ab5b","Type":"ContainerStarted","Data":"9699c4da00ced5490a2c2a7d68eb65790df37448da3f0d0cd54608f726101ff2"} Apr 16 22:15:52.942566 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.942242 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr" event={"ID":"5546cbc6-033e-4879-90dc-24493ae81f46","Type":"ContainerStarted","Data":"c473f75b1f0f087775d73170a8ff90ef8bd79ec3c9106f31dd765b2dff423c62"} Apr 16 22:15:52.942566 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.942289 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr" event={"ID":"5546cbc6-033e-4879-90dc-24493ae81f46","Type":"ContainerStarted","Data":"8d7ce31027d50bf88e95eed400c78a260a29556f6514f49204bc820fa56dd098"} Apr 16 22:15:52.942802 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.942677 2572 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr" Apr 16 22:15:52.944142 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.944086 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-th4nh" event={"ID":"85c31b35-d034-451f-b044-035fb08e99c1","Type":"ContainerStarted","Data":"2cac86631b941877b72f66092d1218515a71d1b7baf1d30608c31713ec143215"} Apr 16 22:15:52.944254 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.944149 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-th4nh" event={"ID":"85c31b35-d034-451f-b044-035fb08e99c1","Type":"ContainerStarted","Data":"11f90b0e336a87b5877addec31149928d5b336dec44d068b77dae7916e597959"} Apr 16 22:15:52.962769 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.962712 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr" podStartSLOduration=1.962691036 podStartE2EDuration="1.962691036s" podCreationTimestamp="2026-04-16 22:15:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:15:52.962509837 +0000 UTC m=+133.187676218" watchObservedRunningTime="2026-04-16 22:15:52.962691036 +0000 UTC m=+133.187857416" Apr 16 22:15:52.984083 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:52.984010 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5c7c6f9bb8-k7dmc"] Apr 16 22:15:52.987823 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:15:52.987793 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e1bbae3_73bb_42df_9209_225120eedbb0.slice/crio-f64a57f2234caeb36406887ffc224fb3c8d7f8eb854f12db0ece5b09534e21bf WatchSource:0}: Error finding container 
f64a57f2234caeb36406887ffc224fb3c8d7f8eb854f12db0ece5b09534e21bf: Status 404 returned error can't find the container with id f64a57f2234caeb36406887ffc224fb3c8d7f8eb854f12db0ece5b09534e21bf Apr 16 22:15:53.949580 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:53.949516 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-th4nh" event={"ID":"85c31b35-d034-451f-b044-035fb08e99c1","Type":"ContainerStarted","Data":"7a3036ba1e4343dff2a50b126c3bc8541b48009ac9609b56600e65c208c949e0"} Apr 16 22:15:53.951802 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:53.951760 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" event={"ID":"5e1bbae3-73bb-42df-9209-225120eedbb0","Type":"ContainerStarted","Data":"a73d4515df6830d25a61f41c45a0d80b868dca02ebd78096815a7bbbb3139c95"} Apr 16 22:15:53.951802 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:53.951809 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" event={"ID":"5e1bbae3-73bb-42df-9209-225120eedbb0","Type":"ContainerStarted","Data":"f64a57f2234caeb36406887ffc224fb3c8d7f8eb854f12db0ece5b09534e21bf"} Apr 16 22:15:53.970704 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:53.970562 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" podStartSLOduration=33.970538927 podStartE2EDuration="33.970538927s" podCreationTimestamp="2026-04-16 22:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:15:53.969254822 +0000 UTC m=+134.194421203" watchObservedRunningTime="2026-04-16 22:15:53.970538927 +0000 UTC m=+134.195705308" Apr 16 22:15:54.575510 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:54.575469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-r575f\" (UID: \"312dcba8-3e17-48d7-9fb2-06758dcd295e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f" Apr 16 22:15:54.578343 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:54.578304 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/312dcba8-3e17-48d7-9fb2-06758dcd295e-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-r575f\" (UID: \"312dcba8-3e17-48d7-9fb2-06758dcd295e\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f" Apr 16 22:15:54.773046 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:54.773012 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-wmljm\"" Apr 16 22:15:54.781494 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:54.781464 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f" Apr 16 22:15:54.845460 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:54.845376 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" Apr 16 22:15:54.848524 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:54.848496 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" Apr 16 22:15:54.955377 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:54.955347 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" Apr 16 22:15:54.956712 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:54.956690 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5c7c6f9bb8-k7dmc" Apr 16 22:15:55.059873 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:55.059851 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-r575f"] Apr 16 22:15:55.063787 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:15:55.063746 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod312dcba8_3e17_48d7_9fb2_06758dcd295e.slice/crio-c687d5ea38a405553cc376c9d04b2ae37759f6e20f7c09746d46acda0a72dbd1 WatchSource:0}: Error finding container c687d5ea38a405553cc376c9d04b2ae37759f6e20f7c09746d46acda0a72dbd1: Status 404 returned error can't find the container with id c687d5ea38a405553cc376c9d04b2ae37759f6e20f7c09746d46acda0a72dbd1 Apr 16 22:15:55.959582 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:55.959541 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f" 
event={"ID":"312dcba8-3e17-48d7-9fb2-06758dcd295e","Type":"ContainerStarted","Data":"c687d5ea38a405553cc376c9d04b2ae37759f6e20f7c09746d46acda0a72dbd1"} Apr 16 22:15:55.961040 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:55.960999 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g" event={"ID":"ee7387b2-e00d-45fd-881b-86e9cd70ab5b","Type":"ContainerStarted","Data":"efdab7623b7a33e0e8abbcd2159dd1bdc41894e4ade67ca668c516d2cacda0ca"} Apr 16 22:15:55.963498 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:55.963157 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-th4nh" event={"ID":"85c31b35-d034-451f-b044-035fb08e99c1","Type":"ContainerStarted","Data":"968ba271a1cc98e2e266fc14a19db2e36f59433f60c217fb08b00b6ff7640dda"} Apr 16 22:15:55.980149 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:55.980064 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bjs2g" podStartSLOduration=33.930521079 podStartE2EDuration="35.980042092s" podCreationTimestamp="2026-04-16 22:15:20 +0000 UTC" firstStartedPulling="2026-04-16 22:15:52.877512188 +0000 UTC m=+133.102678546" lastFinishedPulling="2026-04-16 22:15:54.927033194 +0000 UTC m=+135.152199559" observedRunningTime="2026-04-16 22:15:55.977677048 +0000 UTC m=+136.202843428" watchObservedRunningTime="2026-04-16 22:15:55.980042092 +0000 UTC m=+136.205208474" Apr 16 22:15:55.997877 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:55.997808 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-th4nh" podStartSLOduration=2.5117182270000002 podStartE2EDuration="4.997791418s" podCreationTimestamp="2026-04-16 22:15:51 +0000 UTC" firstStartedPulling="2026-04-16 22:15:52.435684379 +0000 UTC m=+132.660850737" lastFinishedPulling="2026-04-16 22:15:54.92175756 
+0000 UTC m=+135.146923928" observedRunningTime="2026-04-16 22:15:55.995974743 +0000 UTC m=+136.221141126" watchObservedRunningTime="2026-04-16 22:15:55.997791418 +0000 UTC m=+136.222957797" Apr 16 22:15:56.966690 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:56.966640 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f" event={"ID":"312dcba8-3e17-48d7-9fb2-06758dcd295e","Type":"ContainerStarted","Data":"004ce3f209db0749fe0cf56686b4cfc8237de546f9f7ab7d444693c11b47ad17"} Apr 16 22:15:56.983221 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:15:56.983166 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-r575f" podStartSLOduration=33.849993078 podStartE2EDuration="34.98315017s" podCreationTimestamp="2026-04-16 22:15:22 +0000 UTC" firstStartedPulling="2026-04-16 22:15:55.065731482 +0000 UTC m=+135.290897840" lastFinishedPulling="2026-04-16 22:15:56.198888558 +0000 UTC m=+136.424054932" observedRunningTime="2026-04-16 22:15:56.982904671 +0000 UTC m=+137.208071052" watchObservedRunningTime="2026-04-16 22:15:56.98315017 +0000 UTC m=+137.208316551" Apr 16 22:16:03.961425 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:03.961382 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd"] Apr 16 22:16:03.967086 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:03.967057 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" Apr 16 22:16:03.973023 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:03.972993 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-blgr5\"" Apr 16 22:16:03.974184 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:03.974163 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 22:16:03.974354 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:03.974202 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 22:16:03.974418 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:03.974229 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 22:16:03.982691 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:03.982660 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd"] Apr 16 22:16:03.991607 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:03.991575 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-xlf7l"] Apr 16 22:16:03.995325 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:03.995299 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.002406 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.002093 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 22:16:04.002406 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.002129 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-4vs7f\"" Apr 16 22:16:04.002406 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.002203 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 22:16:04.002406 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.002247 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 22:16:04.061292 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.061253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcmwr\" (UniqueName: \"kubernetes.io/projected/3b2bdd50-6512-49b4-bf64-f9c22a868568-kube-api-access-tcmwr\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.061292 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.061292 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b2bdd50-6512-49b4-bf64-f9c22a868568-metrics-client-ca\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.061526 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.061318 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-wtmp\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.061526 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.061368 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.061526 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.061399 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b2bdd50-6512-49b4-bf64-f9c22a868568-sys\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.061526 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.061419 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3b2bdd50-6512-49b4-bf64-f9c22a868568-root\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.061526 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.061456 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-tls\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.061526 ip-10-0-133-16 kubenswrapper[2572]: I0416 
22:16:04.061476 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8ff4f32e-7232-4cba-8429-a746bb63a2a3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-w9dhd\" (UID: \"8ff4f32e-7232-4cba-8429-a746bb63a2a3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" Apr 16 22:16:04.061526 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.061496 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ff4f32e-7232-4cba-8429-a746bb63a2a3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-w9dhd\" (UID: \"8ff4f32e-7232-4cba-8429-a746bb63a2a3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" Apr 16 22:16:04.061786 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.061568 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-textfile\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.061786 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.061593 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-accelerators-collector-config\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.061786 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.061632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ff4f32e-7232-4cba-8429-a746bb63a2a3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-w9dhd\" (UID: \"8ff4f32e-7232-4cba-8429-a746bb63a2a3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" Apr 16 22:16:04.061786 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.061648 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp4k2\" (UniqueName: \"kubernetes.io/projected/8ff4f32e-7232-4cba-8429-a746bb63a2a3-kube-api-access-gp4k2\") pod \"openshift-state-metrics-9d44df66c-w9dhd\" (UID: \"8ff4f32e-7232-4cba-8429-a746bb63a2a3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" Apr 16 22:16:04.162252 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162209 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3b2bdd50-6512-49b4-bf64-f9c22a868568-root\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.162454 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162263 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-tls\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.162454 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8ff4f32e-7232-4cba-8429-a746bb63a2a3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-w9dhd\" (UID: 
\"8ff4f32e-7232-4cba-8429-a746bb63a2a3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" Apr 16 22:16:04.162454 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162298 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3b2bdd50-6512-49b4-bf64-f9c22a868568-root\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.162454 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162325 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ff4f32e-7232-4cba-8429-a746bb63a2a3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-w9dhd\" (UID: \"8ff4f32e-7232-4cba-8429-a746bb63a2a3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" Apr 16 22:16:04.162454 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162418 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-textfile\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.162454 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:16:04.162425 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 22:16:04.162768 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162457 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-accelerators-collector-config\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " 
pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.162768 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:16:04.162491 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-tls podName:3b2bdd50-6512-49b4-bf64-f9c22a868568 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:04.662472337 +0000 UTC m=+144.887638708 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-tls") pod "node-exporter-xlf7l" (UID: "3b2bdd50-6512-49b4-bf64-f9c22a868568") : secret "node-exporter-tls" not found Apr 16 22:16:04.162768 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162542 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ff4f32e-7232-4cba-8429-a746bb63a2a3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-w9dhd\" (UID: \"8ff4f32e-7232-4cba-8429-a746bb63a2a3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" Apr 16 22:16:04.162768 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162567 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gp4k2\" (UniqueName: \"kubernetes.io/projected/8ff4f32e-7232-4cba-8429-a746bb63a2a3-kube-api-access-gp4k2\") pod \"openshift-state-metrics-9d44df66c-w9dhd\" (UID: \"8ff4f32e-7232-4cba-8429-a746bb63a2a3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" Apr 16 22:16:04.162768 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162736 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcmwr\" (UniqueName: \"kubernetes.io/projected/3b2bdd50-6512-49b4-bf64-f9c22a868568-kube-api-access-tcmwr\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " 
pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.163029 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162785 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b2bdd50-6512-49b4-bf64-f9c22a868568-metrics-client-ca\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.163029 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162823 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-wtmp\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.163029 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162838 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-textfile\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.163029 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162850 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.163029 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162899 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b2bdd50-6512-49b4-bf64-f9c22a868568-sys\") pod \"node-exporter-xlf7l\" (UID: 
\"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.163029 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.162991 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b2bdd50-6512-49b4-bf64-f9c22a868568-sys\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.163354 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.163152 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-wtmp\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.163354 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.163163 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-accelerators-collector-config\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.163354 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.163205 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ff4f32e-7232-4cba-8429-a746bb63a2a3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-w9dhd\" (UID: \"8ff4f32e-7232-4cba-8429-a746bb63a2a3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" Apr 16 22:16:04.163354 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.163333 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/3b2bdd50-6512-49b4-bf64-f9c22a868568-metrics-client-ca\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.165226 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.165195 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ff4f32e-7232-4cba-8429-a746bb63a2a3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-w9dhd\" (UID: \"8ff4f32e-7232-4cba-8429-a746bb63a2a3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" Apr 16 22:16:04.165430 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.165410 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.165541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.165520 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8ff4f32e-7232-4cba-8429-a746bb63a2a3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-w9dhd\" (UID: \"8ff4f32e-7232-4cba-8429-a746bb63a2a3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" Apr 16 22:16:04.199181 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.199098 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp4k2\" (UniqueName: \"kubernetes.io/projected/8ff4f32e-7232-4cba-8429-a746bb63a2a3-kube-api-access-gp4k2\") pod \"openshift-state-metrics-9d44df66c-w9dhd\" (UID: \"8ff4f32e-7232-4cba-8429-a746bb63a2a3\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" Apr 16 22:16:04.207280 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.207245 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcmwr\" (UniqueName: \"kubernetes.io/projected/3b2bdd50-6512-49b4-bf64-f9c22a868568-kube-api-access-tcmwr\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.278810 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.278688 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" Apr 16 22:16:04.430842 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.430777 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd"] Apr 16 22:16:04.433417 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:16:04.433389 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ff4f32e_7232_4cba_8429_a746bb63a2a3.slice/crio-5416c5cc70a9ae25e28877691e9a7ea5a9ad468596c3fc771b95c276eaed0a23 WatchSource:0}: Error finding container 5416c5cc70a9ae25e28877691e9a7ea5a9ad468596c3fc771b95c276eaed0a23: Status 404 returned error can't find the container with id 5416c5cc70a9ae25e28877691e9a7ea5a9ad468596c3fc771b95c276eaed0a23 Apr 16 22:16:04.667296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.667264 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-tls\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.669529 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.669507 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3b2bdd50-6512-49b4-bf64-f9c22a868568-node-exporter-tls\") pod \"node-exporter-xlf7l\" (UID: \"3b2bdd50-6512-49b4-bf64-f9c22a868568\") " pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.905728 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.905641 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xlf7l" Apr 16 22:16:04.915809 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:16:04.915774 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b2bdd50_6512_49b4_bf64_f9c22a868568.slice/crio-69cf972b8f7212ecb2598b61668bd2ae0a9607416b0f3415d5e54176333426d0 WatchSource:0}: Error finding container 69cf972b8f7212ecb2598b61668bd2ae0a9607416b0f3415d5e54176333426d0: Status 404 returned error can't find the container with id 69cf972b8f7212ecb2598b61668bd2ae0a9607416b0f3415d5e54176333426d0 Apr 16 22:16:04.989963 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.989921 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xlf7l" event={"ID":"3b2bdd50-6512-49b4-bf64-f9c22a868568","Type":"ContainerStarted","Data":"69cf972b8f7212ecb2598b61668bd2ae0a9607416b0f3415d5e54176333426d0"} Apr 16 22:16:04.991577 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.991547 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" event={"ID":"8ff4f32e-7232-4cba-8429-a746bb63a2a3","Type":"ContainerStarted","Data":"6f1de22bf442ff90067fbc5b1dc12fe6044fc5bdf1f42fb3a8312aa60477834d"} Apr 16 22:16:04.991577 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.991584 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" 
event={"ID":"8ff4f32e-7232-4cba-8429-a746bb63a2a3","Type":"ContainerStarted","Data":"10d88df5b054bf61214cc3f6f354ef66df3f72886ed1ad1d278644079ac9fd61"} Apr 16 22:16:04.991712 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:04.991597 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" event={"ID":"8ff4f32e-7232-4cba-8429-a746bb63a2a3","Type":"ContainerStarted","Data":"5416c5cc70a9ae25e28877691e9a7ea5a9ad468596c3fc771b95c276eaed0a23"} Apr 16 22:16:07.002129 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.001697 2572 generic.go:358] "Generic (PLEG): container finished" podID="3b2bdd50-6512-49b4-bf64-f9c22a868568" containerID="0c08cdb9099bea9f195a2f712090d0fe5c2914ca3babeaf188a9b495d295ca57" exitCode=0 Apr 16 22:16:07.002129 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.001751 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xlf7l" event={"ID":"3b2bdd50-6512-49b4-bf64-f9c22a868568","Type":"ContainerDied","Data":"0c08cdb9099bea9f195a2f712090d0fe5c2914ca3babeaf188a9b495d295ca57"} Apr 16 22:16:07.003869 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.003840 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" event={"ID":"8ff4f32e-7232-4cba-8429-a746bb63a2a3","Type":"ContainerStarted","Data":"2159aa4ca211b8c070414e757e1d689c6379ca6df7e8377a047db111ad245b4e"} Apr 16 22:16:07.077031 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.076944 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-w9dhd" podStartSLOduration=2.69455919 podStartE2EDuration="4.076922906s" podCreationTimestamp="2026-04-16 22:16:03 +0000 UTC" firstStartedPulling="2026-04-16 22:16:04.559460852 +0000 UTC m=+144.784627211" lastFinishedPulling="2026-04-16 22:16:05.941824565 +0000 UTC m=+146.166990927" observedRunningTime="2026-04-16 
22:16:07.073196723 +0000 UTC m=+147.298363103" watchObservedRunningTime="2026-04-16 22:16:07.076922906 +0000 UTC m=+147.302089287" Apr 16 22:16:07.208407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.208365 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5fbd9767d-rjqql"] Apr 16 22:16:07.212042 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.212024 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.216646 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.216626 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 22:16:07.220516 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.220492 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 22:16:07.220516 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.220492 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 22:16:07.220731 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.220581 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 22:16:07.220731 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.220633 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-wkpjn\"" Apr 16 22:16:07.222874 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.222856 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 22:16:07.224543 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.224527 2572 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-urtnnsujfrk\"" Apr 16 22:16:07.246228 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.246195 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5fbd9767d-rjqql"] Apr 16 22:16:07.289091 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.289058 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-grpc-tls\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.289290 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.289127 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.289290 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.289156 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-metrics-client-ca\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.289290 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.289194 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg9lb\" (UniqueName: \"kubernetes.io/projected/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-kube-api-access-lg9lb\") pod 
\"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.289290 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.289220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-thanos-querier-tls\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.289290 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.289246 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.289290 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.289287 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.289595 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.289336 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: 
\"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.390778 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.390733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.390959 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.390795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.390959 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.390818 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-grpc-tls\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.390959 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.390848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.390959 ip-10-0-133-16 
kubenswrapper[2572]: I0416 22:16:07.390875 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-metrics-client-ca\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.391224 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.391191 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lg9lb\" (UniqueName: \"kubernetes.io/projected/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-kube-api-access-lg9lb\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.391378 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.391249 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-thanos-querier-tls\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.391378 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.391280 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.391915 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.391887 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-metrics-client-ca\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.393760 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.393733 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.393858 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.393759 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-grpc-tls\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.393858 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.393811 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.394061 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.394042 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " 
pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.394149 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.394083 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-thanos-querier-tls\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.394265 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.394248 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.405237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.405207 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg9lb\" (UniqueName: \"kubernetes.io/projected/dd94387b-2cfc-4696-8aa0-7cfe19572cf7-kube-api-access-lg9lb\") pod \"thanos-querier-5fbd9767d-rjqql\" (UID: \"dd94387b-2cfc-4696-8aa0-7cfe19572cf7\") " pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.521367 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.521333 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" Apr 16 22:16:07.667599 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:07.667565 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5fbd9767d-rjqql"] Apr 16 22:16:07.672351 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:16:07.672317 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd94387b_2cfc_4696_8aa0_7cfe19572cf7.slice/crio-ecd2dcf20ca8479470997a7ffc05d1413dc41e36094905be490e3b4f56ab1abf WatchSource:0}: Error finding container ecd2dcf20ca8479470997a7ffc05d1413dc41e36094905be490e3b4f56ab1abf: Status 404 returned error can't find the container with id ecd2dcf20ca8479470997a7ffc05d1413dc41e36094905be490e3b4f56ab1abf Apr 16 22:16:08.008891 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.008792 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xlf7l" event={"ID":"3b2bdd50-6512-49b4-bf64-f9c22a868568","Type":"ContainerStarted","Data":"e3c502948979f81aadbdb30ff025bdf72e8483f555813cbc6813d3a6837897e3"} Apr 16 22:16:08.008891 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.008831 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xlf7l" event={"ID":"3b2bdd50-6512-49b4-bf64-f9c22a868568","Type":"ContainerStarted","Data":"897bd04f5757bb90e9ed87fa036369c429ee853a8cd38e70a197136b448d588a"} Apr 16 22:16:08.009892 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.009867 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" event={"ID":"dd94387b-2cfc-4696-8aa0-7cfe19572cf7","Type":"ContainerStarted","Data":"ecd2dcf20ca8479470997a7ffc05d1413dc41e36094905be490e3b4f56ab1abf"} Apr 16 22:16:08.832923 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.832870 2572 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/node-exporter-xlf7l" podStartSLOduration=4.8069652739999995 podStartE2EDuration="5.832852986s" podCreationTimestamp="2026-04-16 22:16:03 +0000 UTC" firstStartedPulling="2026-04-16 22:16:04.917548453 +0000 UTC m=+145.142714811" lastFinishedPulling="2026-04-16 22:16:05.943436164 +0000 UTC m=+146.168602523" observedRunningTime="2026-04-16 22:16:08.047302178 +0000 UTC m=+148.272468558" watchObservedRunningTime="2026-04-16 22:16:08.832852986 +0000 UTC m=+149.058019392" Apr 16 22:16:08.834557 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.834526 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-67b66ff5c-b8v8q"] Apr 16 22:16:08.837780 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.837760 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q" Apr 16 22:16:08.848311 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.848285 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 22:16:08.849237 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.849220 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 22:16:08.849310 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.849221 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 22:16:08.849354 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.849221 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-nv2hk\"" Apr 16 22:16:08.849572 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.849551 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" 
Apr 16 22:16:08.849702 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.849640 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-7si1k5b0flj8s\"" Apr 16 22:16:08.864835 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.864801 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-67b66ff5c-b8v8q"] Apr 16 22:16:08.906649 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.906606 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-client-ca-bundle\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q" Apr 16 22:16:08.906649 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.906651 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-metrics-server-audit-profiles\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q" Apr 16 22:16:08.906908 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.906675 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-secret-metrics-server-tls\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q" Apr 16 22:16:08.906908 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.906716 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9mq7\" 
(UniqueName: \"kubernetes.io/projected/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-kube-api-access-k9mq7\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q" Apr 16 22:16:08.906908 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.906749 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-audit-log\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q" Apr 16 22:16:08.906908 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.906766 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-secret-metrics-server-client-certs\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q" Apr 16 22:16:08.906908 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:08.906867 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q" Apr 16 22:16:09.007967 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.007928 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-audit-log\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " 
pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:09.008186 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.007976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-secret-metrics-server-client-certs\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:09.008186 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.008028 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:09.008186 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.008089 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-client-ca-bundle\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:09.008186 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.008138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-metrics-server-audit-profiles\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:09.008186 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.008170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-secret-metrics-server-tls\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:09.008435 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.008218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9mq7\" (UniqueName: \"kubernetes.io/projected/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-kube-api-access-k9mq7\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:09.008486 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.008445 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-audit-log\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:09.008902 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.008872 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:09.009395 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.009363 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-metrics-server-audit-profiles\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:09.010960 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.010930 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-secret-metrics-server-client-certs\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:09.011341 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.011317 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-client-ca-bundle\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:09.011429 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.011318 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-secret-metrics-server-tls\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:09.021630 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.021602 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9mq7\" (UniqueName: \"kubernetes.io/projected/239bdbcf-4bb5-4a84-b8ff-6bf0e532f445-kube-api-access-k9mq7\") pod \"metrics-server-67b66ff5c-b8v8q\" (UID: \"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445\") " pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:09.147595 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.147501 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:09.645149 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:09.645125 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-67b66ff5c-b8v8q"]
Apr 16 22:16:09.647302 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:16:09.647270 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod239bdbcf_4bb5_4a84_b8ff_6bf0e532f445.slice/crio-5561b8cc4ec8a7c8565b32e9bd9a3a14a49595a07baff38d53bcb1f147c6bbb8 WatchSource:0}: Error finding container 5561b8cc4ec8a7c8565b32e9bd9a3a14a49595a07baff38d53bcb1f147c6bbb8: Status 404 returned error can't find the container with id 5561b8cc4ec8a7c8565b32e9bd9a3a14a49595a07baff38d53bcb1f147c6bbb8
Apr 16 22:16:10.016256 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:10.016216 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q" event={"ID":"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445","Type":"ContainerStarted","Data":"5561b8cc4ec8a7c8565b32e9bd9a3a14a49595a07baff38d53bcb1f147c6bbb8"}
Apr 16 22:16:10.018179 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:10.018142 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" event={"ID":"dd94387b-2cfc-4696-8aa0-7cfe19572cf7","Type":"ContainerStarted","Data":"522dcafddc0e27a0e8fad1409a70b256d6002e618031525159d2e92953b57c74"}
Apr 16 22:16:10.018179 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:10.018179 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" event={"ID":"dd94387b-2cfc-4696-8aa0-7cfe19572cf7","Type":"ContainerStarted","Data":"8b6783330b8f089537fde4e865e2f1136a9ef967b06b5972acc2612da8dae744"}
Apr 16 22:16:10.018330 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:10.018190 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" event={"ID":"dd94387b-2cfc-4696-8aa0-7cfe19572cf7","Type":"ContainerStarted","Data":"9ce2f2ff764ad2ee2679b3ce2067bdd69f8ff99f10731466dfc422c410de7ef6"}
Apr 16 22:16:11.024163 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:11.024054 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" event={"ID":"dd94387b-2cfc-4696-8aa0-7cfe19572cf7","Type":"ContainerStarted","Data":"fc14aa93723b3d1b4c8fbd1511f54da07998de9eff81b4b776d8838d333e2212"}
Apr 16 22:16:11.024163 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:11.024121 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" event={"ID":"dd94387b-2cfc-4696-8aa0-7cfe19572cf7","Type":"ContainerStarted","Data":"f2a675cc48963a273184e3bb81ce9b8b6dff38b5fe003627538e64b1d57dc6f6"}
Apr 16 22:16:12.029253 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:12.029207 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" event={"ID":"dd94387b-2cfc-4696-8aa0-7cfe19572cf7","Type":"ContainerStarted","Data":"019d64ee354a9c6dd1d30314dcf28852052e1f19489dce8aa4881c55f8fd390e"}
Apr 16 22:16:12.029691 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:12.029406 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql"
Apr 16 22:16:12.030599 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:12.030574 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q" event={"ID":"239bdbcf-4bb5-4a84-b8ff-6bf0e532f445","Type":"ContainerStarted","Data":"e9fd27b75a086b225c375e2edf9cf0fcd44072385b22b769480f687e1692e0ff"}
Apr 16 22:16:12.054963 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:12.054911 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql" podStartSLOduration=2.134441443 podStartE2EDuration="5.054894091s" podCreationTimestamp="2026-04-16 22:16:07 +0000 UTC" firstStartedPulling="2026-04-16 22:16:07.674434153 +0000 UTC m=+147.899600511" lastFinishedPulling="2026-04-16 22:16:10.594886797 +0000 UTC m=+150.820053159" observedRunningTime="2026-04-16 22:16:12.053712391 +0000 UTC m=+152.278878773" watchObservedRunningTime="2026-04-16 22:16:12.054894091 +0000 UTC m=+152.280060470"
Apr 16 22:16:12.071924 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:12.071868 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q" podStartSLOduration=2.536680476 podStartE2EDuration="4.071853285s" podCreationTimestamp="2026-04-16 22:16:08 +0000 UTC" firstStartedPulling="2026-04-16 22:16:09.649537865 +0000 UTC m=+149.874704224" lastFinishedPulling="2026-04-16 22:16:11.184710675 +0000 UTC m=+151.409877033" observedRunningTime="2026-04-16 22:16:12.070571648 +0000 UTC m=+152.295738037" watchObservedRunningTime="2026-04-16 22:16:12.071853285 +0000 UTC m=+152.297019664"
Apr 16 22:16:12.298830 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:12.298789 2572 patch_prober.go:28] interesting pod/image-registry-69fb54c674-n5bgr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 22:16:12.299017 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:12.298863 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr" podUID="5546cbc6-033e-4879-90dc-24493ae81f46" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 22:16:13.956652 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:13.956614 2572 patch_prober.go:28] interesting pod/image-registry-69fb54c674-n5bgr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 22:16:13.957044 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:13.956685 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr" podUID="5546cbc6-033e-4879-90dc-24493ae81f46" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 22:16:15.145468 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:16:15.145418 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-mf4m2" podUID="2357d8ff-11e8-4af7-a6de-7223d6294ade"
Apr 16 22:16:15.161584 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:16:15.161533 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-tbnzs" podUID="fd83defb-549f-4b35-96b4-9fe5c34c7ebf"
Apr 16 22:16:16.042923 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:16.042889 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:16:18.039018 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:18.038987 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5fbd9767d-rjqql"
Apr 16 22:16:20.112780 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:20.112740 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert\") pod \"ingress-canary-tbnzs\" (UID: \"fd83defb-549f-4b35-96b4-9fe5c34c7ebf\") " pod="openshift-ingress-canary/ingress-canary-tbnzs"
Apr 16 22:16:20.113411 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:20.112822 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:16:20.115224 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:20.115198 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2357d8ff-11e8-4af7-a6de-7223d6294ade-metrics-tls\") pod \"dns-default-mf4m2\" (UID: \"2357d8ff-11e8-4af7-a6de-7223d6294ade\") " pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:16:20.115374 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:20.115353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd83defb-549f-4b35-96b4-9fe5c34c7ebf-cert\") pod \"ingress-canary-tbnzs\" (UID: \"fd83defb-549f-4b35-96b4-9fe5c34c7ebf\") " pod="openshift-ingress-canary/ingress-canary-tbnzs"
Apr 16 22:16:20.246768 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:20.246736 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nlsdl\""
Apr 16 22:16:20.255008 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:20.254970 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:16:20.399380 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:20.399303 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mf4m2"]
Apr 16 22:16:20.402833 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:16:20.402805 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2357d8ff_11e8_4af7_a6de_7223d6294ade.slice/crio-29a4289fdb8911682770be727d00eb4f212e656db2a3df86b74c79dfd4a50c17 WatchSource:0}: Error finding container 29a4289fdb8911682770be727d00eb4f212e656db2a3df86b74c79dfd4a50c17: Status 404 returned error can't find the container with id 29a4289fdb8911682770be727d00eb4f212e656db2a3df86b74c79dfd4a50c17
Apr 16 22:16:21.057583 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:21.057534 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mf4m2" event={"ID":"2357d8ff-11e8-4af7-a6de-7223d6294ade","Type":"ContainerStarted","Data":"29a4289fdb8911682770be727d00eb4f212e656db2a3df86b74c79dfd4a50c17"}
Apr 16 22:16:22.062264 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:22.062226 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mf4m2" event={"ID":"2357d8ff-11e8-4af7-a6de-7223d6294ade","Type":"ContainerStarted","Data":"64a70185211c9a8eeb0fd797b499e9712c2ddf84fc339ba56a16efd3db1a0062"}
Apr 16 22:16:22.062264 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:22.062263 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mf4m2" event={"ID":"2357d8ff-11e8-4af7-a6de-7223d6294ade","Type":"ContainerStarted","Data":"caa1e503231549f6967dd13c80d14e3df36cabf55695915c423df1aee461f811"}
Apr 16 22:16:22.062802 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:22.062296 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:16:22.086346 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:22.086298 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mf4m2" podStartSLOduration=128.857920107 podStartE2EDuration="2m10.08628304s" podCreationTimestamp="2026-04-16 22:14:12 +0000 UTC" firstStartedPulling="2026-04-16 22:16:20.40461217 +0000 UTC m=+160.629778527" lastFinishedPulling="2026-04-16 22:16:21.632975098 +0000 UTC m=+161.858141460" observedRunningTime="2026-04-16 22:16:22.082142811 +0000 UTC m=+162.307309188" watchObservedRunningTime="2026-04-16 22:16:22.08628304 +0000 UTC m=+162.311449461"
Apr 16 22:16:22.297614 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:22.297574 2572 patch_prober.go:28] interesting pod/image-registry-69fb54c674-n5bgr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 22:16:22.297807 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:22.297634 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr" podUID="5546cbc6-033e-4879-90dc-24493ae81f46" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 22:16:23.955563 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:23.955523 2572 patch_prober.go:28] interesting pod/image-registry-69fb54c674-n5bgr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 22:16:23.955960 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:23.955584 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr" podUID="5546cbc6-033e-4879-90dc-24493ae81f46" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 22:16:27.439784 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:27.439736 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tbnzs"
Apr 16 22:16:27.442923 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:27.442899 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-skfrk\""
Apr 16 22:16:27.450462 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:27.450438 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tbnzs"
Apr 16 22:16:27.580728 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:27.580703 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tbnzs"]
Apr 16 22:16:27.582291 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:16:27.582257 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd83defb_549f_4b35_96b4_9fe5c34c7ebf.slice/crio-e29095d446afcec0cf0c8a429189555f5c29f5a791b6b1988999d0f10529589a WatchSource:0}: Error finding container e29095d446afcec0cf0c8a429189555f5c29f5a791b6b1988999d0f10529589a: Status 404 returned error can't find the container with id e29095d446afcec0cf0c8a429189555f5c29f5a791b6b1988999d0f10529589a
Apr 16 22:16:28.084092 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:28.084050 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tbnzs" event={"ID":"fd83defb-549f-4b35-96b4-9fe5c34c7ebf","Type":"ContainerStarted","Data":"e29095d446afcec0cf0c8a429189555f5c29f5a791b6b1988999d0f10529589a"}
Apr 16 22:16:29.148433 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:29.148388 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:29.148919 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:29.148445 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:30.091966 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:30.091933 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tbnzs" event={"ID":"fd83defb-549f-4b35-96b4-9fe5c34c7ebf","Type":"ContainerStarted","Data":"97f7f47b0849f15d8821526da772ce68ef68b58706abf2d2fcd6225b0d4cacb1"}
Apr 16 22:16:30.111120 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:30.111002 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tbnzs" podStartSLOduration=135.839621154 podStartE2EDuration="2m18.110987693s" podCreationTimestamp="2026-04-16 22:14:12 +0000 UTC" firstStartedPulling="2026-04-16 22:16:27.584151271 +0000 UTC m=+167.809317628" lastFinishedPulling="2026-04-16 22:16:29.855517806 +0000 UTC m=+170.080684167" observedRunningTime="2026-04-16 22:16:30.109393542 +0000 UTC m=+170.334559933" watchObservedRunningTime="2026-04-16 22:16:30.110987693 +0000 UTC m=+170.336154101"
Apr 16 22:16:32.070223 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:32.070191 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mf4m2"
Apr 16 22:16:32.297587 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:32.297552 2572 patch_prober.go:28] interesting pod/image-registry-69fb54c674-n5bgr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 22:16:32.297751 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:32.297616 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr" podUID="5546cbc6-033e-4879-90dc-24493ae81f46" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 22:16:32.297751 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:32.297653 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:16:32.298230 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:32.298187 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"c473f75b1f0f087775d73170a8ff90ef8bd79ec3c9106f31dd765b2dff423c62"} pod="openshift-image-registry/image-registry-69fb54c674-n5bgr" containerMessage="Container registry failed liveness probe, will be restarted"
Apr 16 22:16:32.301433 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:32.301408 2572 patch_prober.go:28] interesting pod/image-registry-69fb54c674-n5bgr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 22:16:32.301535 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:32.301453 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr" podUID="5546cbc6-033e-4879-90dc-24493ae81f46" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 22:16:42.302391 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:42.302364 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:16:46.141244 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:46.141207 2572 generic.go:358] "Generic (PLEG): container finished" podID="64846700-7b7f-4273-b209-6b6a814c538e" containerID="e043bfcfb3eaa6ffa1222dd592423bd46210bbc5118924bde42ed66f93a1a2b8" exitCode=0
Apr 16 22:16:46.141616 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:46.141260 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv" event={"ID":"64846700-7b7f-4273-b209-6b6a814c538e","Type":"ContainerDied","Data":"e043bfcfb3eaa6ffa1222dd592423bd46210bbc5118924bde42ed66f93a1a2b8"}
Apr 16 22:16:46.141616 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:46.141586 2572 scope.go:117] "RemoveContainer" containerID="e043bfcfb3eaa6ffa1222dd592423bd46210bbc5118924bde42ed66f93a1a2b8"
Apr 16 22:16:47.145721 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:47.145682 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-nkvrv" event={"ID":"64846700-7b7f-4273-b209-6b6a814c538e","Type":"ContainerStarted","Data":"518d793f93c6b37ba9b803d311ed6f6e78302f323c1ad119fd38748866da6014"}
Apr 16 22:16:49.153256 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:49.153227 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:49.157206 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:49.157185 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-67b66ff5c-b8v8q"
Apr 16 22:16:51.161137 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:51.161089 2572 generic.go:358] "Generic (PLEG): container finished" podID="8642ce80-b932-4245-8f5a-3ce2e6014659" containerID="e3a4f135d67c55fc33f427b4d0172d2601654fc193db30b62efed7a278108896" exitCode=0
Apr 16 22:16:51.161503 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:51.161164 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2qjf4" event={"ID":"8642ce80-b932-4245-8f5a-3ce2e6014659","Type":"ContainerDied","Data":"e3a4f135d67c55fc33f427b4d0172d2601654fc193db30b62efed7a278108896"}
Apr 16 22:16:51.161564 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:51.161547 2572 scope.go:117] "RemoveContainer" containerID="e3a4f135d67c55fc33f427b4d0172d2601654fc193db30b62efed7a278108896"
Apr 16 22:16:52.165756 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:52.165710 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2qjf4" event={"ID":"8642ce80-b932-4245-8f5a-3ce2e6014659","Type":"ContainerStarted","Data":"f1919bd5e888ab7f0ae9671babac23db9851b75b10263a9b27678d9bf7e2839a"}
Apr 16 22:16:52.186255 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:52.186224 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5c7c6f9bb8-k7dmc_5e1bbae3-73bb-42df-9209-225120eedbb0/router/0.log"
Apr 16 22:16:52.208916 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:52.208887 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-tbnzs_fd83defb-549f-4b35-96b4-9fe5c34c7ebf/serve-healthcheck-canary/0.log"
Apr 16 22:16:56.178659 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:56.178622 2572 generic.go:358] "Generic (PLEG): container finished" podID="477be877-0af9-4fc8-85b1-24658b74a7a8" containerID="df27a52ca9ddc6093bed8ec13a7e887ae0162e4d284c70b686852f1f9a0d3ffd" exitCode=0
Apr 16 22:16:56.179082 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:56.178695 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq" event={"ID":"477be877-0af9-4fc8-85b1-24658b74a7a8","Type":"ContainerDied","Data":"df27a52ca9ddc6093bed8ec13a7e887ae0162e4d284c70b686852f1f9a0d3ffd"}
Apr 16 22:16:56.179082 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:56.179014 2572 scope.go:117] "RemoveContainer" containerID="df27a52ca9ddc6093bed8ec13a7e887ae0162e4d284c70b686852f1f9a0d3ffd"
Apr 16 22:16:57.183475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:57.183441 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sblhq" event={"ID":"477be877-0af9-4fc8-85b1-24658b74a7a8","Type":"ContainerStarted","Data":"a857f6fbd45cc141998ff629d645f3d4ee65411f85b3339dfd8ed721872ebe02"}
Apr 16 22:16:57.317346 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:57.317297 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr" podUID="5546cbc6-033e-4879-90dc-24493ae81f46" containerName="registry" containerID="cri-o://c473f75b1f0f087775d73170a8ff90ef8bd79ec3c9106f31dd765b2dff423c62" gracePeriod=30
Apr 16 22:16:59.192553 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:59.192519 2572 generic.go:358] "Generic (PLEG): container finished" podID="5546cbc6-033e-4879-90dc-24493ae81f46" containerID="c473f75b1f0f087775d73170a8ff90ef8bd79ec3c9106f31dd765b2dff423c62" exitCode=0
Apr 16 22:16:59.192938 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:59.192585 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr" event={"ID":"5546cbc6-033e-4879-90dc-24493ae81f46","Type":"ContainerDied","Data":"c473f75b1f0f087775d73170a8ff90ef8bd79ec3c9106f31dd765b2dff423c62"}
Apr 16 22:16:59.192938 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:59.192608 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr" event={"ID":"5546cbc6-033e-4879-90dc-24493ae81f46","Type":"ContainerStarted","Data":"d5945b0d6e95a4d2b5d07ecc2c99d76a2a61f04b93c6b35be36c78f0ef6850d2"}
Apr 16 22:16:59.192938 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:16:59.192641 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:17:20.199777 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:17:20.199746 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-69fb54c674-n5bgr"
Apr 16 22:18:40.335452 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:18:40.335424 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/0.log"
Apr 16 22:18:40.335998 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:18:40.335978 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/0.log"
Apr 16 22:18:40.338003 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:18:40.337984 2572 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 22:19:17.855573 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:17.855529 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-ktlhf"]
Apr 16 22:19:17.859012 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:17.858987 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktlhf"
Apr 16 22:19:17.863518 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:17.863492 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 22:19:17.870948 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:17.870916 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ktlhf"]
Apr 16 22:19:17.909472 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:17.909425 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a800d5ed-9725-4dd0-a59c-eab57f01a3bc-dbus\") pod \"global-pull-secret-syncer-ktlhf\" (UID: \"a800d5ed-9725-4dd0-a59c-eab57f01a3bc\") " pod="kube-system/global-pull-secret-syncer-ktlhf"
Apr 16 22:19:17.909647 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:17.909479 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a800d5ed-9725-4dd0-a59c-eab57f01a3bc-original-pull-secret\") pod \"global-pull-secret-syncer-ktlhf\" (UID: \"a800d5ed-9725-4dd0-a59c-eab57f01a3bc\") " pod="kube-system/global-pull-secret-syncer-ktlhf"
Apr 16 22:19:17.909647 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:17.909554 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a800d5ed-9725-4dd0-a59c-eab57f01a3bc-kubelet-config\") pod \"global-pull-secret-syncer-ktlhf\" (UID: \"a800d5ed-9725-4dd0-a59c-eab57f01a3bc\") " pod="kube-system/global-pull-secret-syncer-ktlhf"
Apr 16 22:19:18.010446 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:18.010398 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a800d5ed-9725-4dd0-a59c-eab57f01a3bc-original-pull-secret\") pod \"global-pull-secret-syncer-ktlhf\" (UID: \"a800d5ed-9725-4dd0-a59c-eab57f01a3bc\") " pod="kube-system/global-pull-secret-syncer-ktlhf"
Apr 16 22:19:18.010446 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:18.010456 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a800d5ed-9725-4dd0-a59c-eab57f01a3bc-kubelet-config\") pod \"global-pull-secret-syncer-ktlhf\" (UID: \"a800d5ed-9725-4dd0-a59c-eab57f01a3bc\") " pod="kube-system/global-pull-secret-syncer-ktlhf"
Apr 16 22:19:18.010632 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:18.010509 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a800d5ed-9725-4dd0-a59c-eab57f01a3bc-dbus\") pod \"global-pull-secret-syncer-ktlhf\" (UID: \"a800d5ed-9725-4dd0-a59c-eab57f01a3bc\") " pod="kube-system/global-pull-secret-syncer-ktlhf"
Apr 16 22:19:18.010632 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:18.010599 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a800d5ed-9725-4dd0-a59c-eab57f01a3bc-kubelet-config\") pod \"global-pull-secret-syncer-ktlhf\" (UID: \"a800d5ed-9725-4dd0-a59c-eab57f01a3bc\") " pod="kube-system/global-pull-secret-syncer-ktlhf"
Apr 16 22:19:18.010700 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:18.010643 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a800d5ed-9725-4dd0-a59c-eab57f01a3bc-dbus\") pod \"global-pull-secret-syncer-ktlhf\" (UID: \"a800d5ed-9725-4dd0-a59c-eab57f01a3bc\") " pod="kube-system/global-pull-secret-syncer-ktlhf"
Apr 16 22:19:18.012759 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:18.012739 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a800d5ed-9725-4dd0-a59c-eab57f01a3bc-original-pull-secret\") pod \"global-pull-secret-syncer-ktlhf\" (UID: \"a800d5ed-9725-4dd0-a59c-eab57f01a3bc\") " pod="kube-system/global-pull-secret-syncer-ktlhf"
Apr 16 22:19:18.167937 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:18.167846 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktlhf"
Apr 16 22:19:18.287297 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:18.287271 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ktlhf"]
Apr 16 22:19:18.289225 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:19:18.289197 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda800d5ed_9725_4dd0_a59c_eab57f01a3bc.slice/crio-820fa3ffe218fc0b0007c806f28ff5b5fb14c45f3ab731a2cc1f693e4939a4d8 WatchSource:0}: Error finding container 820fa3ffe218fc0b0007c806f28ff5b5fb14c45f3ab731a2cc1f693e4939a4d8: Status 404 returned error can't find the container with id 820fa3ffe218fc0b0007c806f28ff5b5fb14c45f3ab731a2cc1f693e4939a4d8
Apr 16 22:19:18.296057 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:18.296036 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:19:18.607251 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:18.607217 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ktlhf" event={"ID":"a800d5ed-9725-4dd0-a59c-eab57f01a3bc","Type":"ContainerStarted","Data":"820fa3ffe218fc0b0007c806f28ff5b5fb14c45f3ab731a2cc1f693e4939a4d8"}
Apr 16 22:19:22.621356 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:22.621318 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ktlhf" event={"ID":"a800d5ed-9725-4dd0-a59c-eab57f01a3bc","Type":"ContainerStarted","Data":"829d20ec8de3276b0b29e6f557dc3d464029fa4086db15b3c4fc031ef7da8156"}
Apr 16 22:19:22.638917 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:19:22.638864 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-ktlhf" podStartSLOduration=1.984023332 podStartE2EDuration="5.638847664s" podCreationTimestamp="2026-04-16 22:19:17 +0000 UTC" firstStartedPulling="2026-04-16 22:19:18.296187208 +0000 UTC m=+338.521353569" lastFinishedPulling="2026-04-16 22:19:21.951011544 +0000 UTC m=+342.176177901" observedRunningTime="2026-04-16 22:19:22.637402933 +0000 UTC m=+342.862569313" watchObservedRunningTime="2026-04-16 22:19:22.638847664 +0000 UTC m=+342.864014044"
Apr 16 22:20:04.017937 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.017896 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ghbds"]
Apr 16 22:20:04.020250 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.020229 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-ghbds" Apr 16 22:20:04.024295 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.024268 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 22:20:04.024622 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.024604 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 22:20:04.024730 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.024640 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 22:20:04.025581 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.025564 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 22:20:04.025673 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.025586 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-zxc22\"" Apr 16 22:20:04.025739 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.025668 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 22:20:04.037722 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.037691 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ghbds"] Apr 16 22:20:04.206364 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.206320 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/fed1d227-a7b0-4ca3-89ab-0a7e10a58d90-cabundle0\") pod \"keda-operator-ffbb595cb-ghbds\" (UID: \"fed1d227-a7b0-4ca3-89ab-0a7e10a58d90\") " pod="openshift-keda/keda-operator-ffbb595cb-ghbds" Apr 16 22:20:04.206540 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.206386 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzl4w\" (UniqueName: \"kubernetes.io/projected/fed1d227-a7b0-4ca3-89ab-0a7e10a58d90-kube-api-access-vzl4w\") pod \"keda-operator-ffbb595cb-ghbds\" (UID: \"fed1d227-a7b0-4ca3-89ab-0a7e10a58d90\") " pod="openshift-keda/keda-operator-ffbb595cb-ghbds" Apr 16 22:20:04.206540 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.206445 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fed1d227-a7b0-4ca3-89ab-0a7e10a58d90-certificates\") pod \"keda-operator-ffbb595cb-ghbds\" (UID: \"fed1d227-a7b0-4ca3-89ab-0a7e10a58d90\") " pod="openshift-keda/keda-operator-ffbb595cb-ghbds" Apr 16 22:20:04.307406 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.307372 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/fed1d227-a7b0-4ca3-89ab-0a7e10a58d90-cabundle0\") pod \"keda-operator-ffbb595cb-ghbds\" (UID: \"fed1d227-a7b0-4ca3-89ab-0a7e10a58d90\") " pod="openshift-keda/keda-operator-ffbb595cb-ghbds" Apr 16 22:20:04.307605 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.307423 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzl4w\" (UniqueName: \"kubernetes.io/projected/fed1d227-a7b0-4ca3-89ab-0a7e10a58d90-kube-api-access-vzl4w\") pod \"keda-operator-ffbb595cb-ghbds\" (UID: \"fed1d227-a7b0-4ca3-89ab-0a7e10a58d90\") " pod="openshift-keda/keda-operator-ffbb595cb-ghbds" Apr 16 22:20:04.307605 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.307452 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fed1d227-a7b0-4ca3-89ab-0a7e10a58d90-certificates\") pod \"keda-operator-ffbb595cb-ghbds\" (UID: \"fed1d227-a7b0-4ca3-89ab-0a7e10a58d90\") " 
pod="openshift-keda/keda-operator-ffbb595cb-ghbds" Apr 16 22:20:04.307605 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:04.307563 2572 secret.go:281] references non-existent secret key: ca.crt Apr 16 22:20:04.307605 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:04.307574 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 22:20:04.307605 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:04.307582 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ghbds: references non-existent secret key: ca.crt Apr 16 22:20:04.307872 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:04.307640 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fed1d227-a7b0-4ca3-89ab-0a7e10a58d90-certificates podName:fed1d227-a7b0-4ca3-89ab-0a7e10a58d90 nodeName:}" failed. No retries permitted until 2026-04-16 22:20:04.807625575 +0000 UTC m=+385.032791933 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fed1d227-a7b0-4ca3-89ab-0a7e10a58d90-certificates") pod "keda-operator-ffbb595cb-ghbds" (UID: "fed1d227-a7b0-4ca3-89ab-0a7e10a58d90") : references non-existent secret key: ca.crt Apr 16 22:20:04.308138 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.308098 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/fed1d227-a7b0-4ca3-89ab-0a7e10a58d90-cabundle0\") pod \"keda-operator-ffbb595cb-ghbds\" (UID: \"fed1d227-a7b0-4ca3-89ab-0a7e10a58d90\") " pod="openshift-keda/keda-operator-ffbb595cb-ghbds" Apr 16 22:20:04.318729 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.318700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzl4w\" (UniqueName: \"kubernetes.io/projected/fed1d227-a7b0-4ca3-89ab-0a7e10a58d90-kube-api-access-vzl4w\") pod \"keda-operator-ffbb595cb-ghbds\" (UID: \"fed1d227-a7b0-4ca3-89ab-0a7e10a58d90\") " pod="openshift-keda/keda-operator-ffbb595cb-ghbds" Apr 16 22:20:04.368521 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.368482 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6"] Apr 16 22:20:04.370987 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.370969 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" Apr 16 22:20:04.373598 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.373571 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 22:20:04.379805 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.379780 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6"] Apr 16 22:20:04.509044 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.508998 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/aa9ff6f3-2a4b-4153-b466-1d552b2f438d-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hn9j6\" (UID: \"aa9ff6f3-2a4b-4153-b466-1d552b2f438d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" Apr 16 22:20:04.509250 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.509081 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aa9ff6f3-2a4b-4153-b466-1d552b2f438d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hn9j6\" (UID: \"aa9ff6f3-2a4b-4153-b466-1d552b2f438d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" Apr 16 22:20:04.509250 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.509154 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b28vt\" (UniqueName: \"kubernetes.io/projected/aa9ff6f3-2a4b-4153-b466-1d552b2f438d-kube-api-access-b28vt\") pod \"keda-metrics-apiserver-7c9f485588-hn9j6\" (UID: \"aa9ff6f3-2a4b-4153-b466-1d552b2f438d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" Apr 16 22:20:04.610604 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.610504 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/aa9ff6f3-2a4b-4153-b466-1d552b2f438d-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hn9j6\" (UID: \"aa9ff6f3-2a4b-4153-b466-1d552b2f438d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" Apr 16 22:20:04.610604 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.610593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aa9ff6f3-2a4b-4153-b466-1d552b2f438d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hn9j6\" (UID: \"aa9ff6f3-2a4b-4153-b466-1d552b2f438d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" Apr 16 22:20:04.610798 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.610655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b28vt\" (UniqueName: \"kubernetes.io/projected/aa9ff6f3-2a4b-4153-b466-1d552b2f438d-kube-api-access-b28vt\") pod \"keda-metrics-apiserver-7c9f485588-hn9j6\" (UID: \"aa9ff6f3-2a4b-4153-b466-1d552b2f438d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" Apr 16 22:20:04.610798 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:04.610723 2572 secret.go:281] references non-existent secret key: tls.crt Apr 16 22:20:04.610798 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:04.610745 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 22:20:04.610798 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:04.610763 2572 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 16 22:20:04.610798 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:04.610782 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6: [references non-existent secret key: tls.crt, 
secret "keda-metrics-apiserver-certs" not found] Apr 16 22:20:04.610963 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:04.610851 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa9ff6f3-2a4b-4153-b466-1d552b2f438d-certificates podName:aa9ff6f3-2a4b-4153-b466-1d552b2f438d nodeName:}" failed. No retries permitted until 2026-04-16 22:20:05.110833288 +0000 UTC m=+385.335999651 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/aa9ff6f3-2a4b-4153-b466-1d552b2f438d-certificates") pod "keda-metrics-apiserver-7c9f485588-hn9j6" (UID: "aa9ff6f3-2a4b-4153-b466-1d552b2f438d") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 22:20:04.611021 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.610969 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/aa9ff6f3-2a4b-4153-b466-1d552b2f438d-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hn9j6\" (UID: \"aa9ff6f3-2a4b-4153-b466-1d552b2f438d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" Apr 16 22:20:04.620321 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.620285 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b28vt\" (UniqueName: \"kubernetes.io/projected/aa9ff6f3-2a4b-4153-b466-1d552b2f438d-kube-api-access-b28vt\") pod \"keda-metrics-apiserver-7c9f485588-hn9j6\" (UID: \"aa9ff6f3-2a4b-4153-b466-1d552b2f438d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" Apr 16 22:20:04.812187 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:04.812142 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fed1d227-a7b0-4ca3-89ab-0a7e10a58d90-certificates\") pod \"keda-operator-ffbb595cb-ghbds\" (UID: \"fed1d227-a7b0-4ca3-89ab-0a7e10a58d90\") " 
pod="openshift-keda/keda-operator-ffbb595cb-ghbds" Apr 16 22:20:04.812358 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:04.812306 2572 secret.go:281] references non-existent secret key: ca.crt Apr 16 22:20:04.812358 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:04.812331 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 22:20:04.812358 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:04.812344 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ghbds: references non-existent secret key: ca.crt Apr 16 22:20:04.812459 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:04.812411 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fed1d227-a7b0-4ca3-89ab-0a7e10a58d90-certificates podName:fed1d227-a7b0-4ca3-89ab-0a7e10a58d90 nodeName:}" failed. No retries permitted until 2026-04-16 22:20:05.812389412 +0000 UTC m=+386.037555774 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fed1d227-a7b0-4ca3-89ab-0a7e10a58d90-certificates") pod "keda-operator-ffbb595cb-ghbds" (UID: "fed1d227-a7b0-4ca3-89ab-0a7e10a58d90") : references non-existent secret key: ca.crt Apr 16 22:20:05.114648 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:05.114609 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aa9ff6f3-2a4b-4153-b466-1d552b2f438d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hn9j6\" (UID: \"aa9ff6f3-2a4b-4153-b466-1d552b2f438d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" Apr 16 22:20:05.115025 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:05.114761 2572 secret.go:281] references non-existent secret key: tls.crt Apr 16 22:20:05.115025 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:05.114783 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 22:20:05.115025 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:05.114802 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6: references non-existent secret key: tls.crt Apr 16 22:20:05.115025 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:20:05.114856 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa9ff6f3-2a4b-4153-b466-1d552b2f438d-certificates podName:aa9ff6f3-2a4b-4153-b466-1d552b2f438d nodeName:}" failed. No retries permitted until 2026-04-16 22:20:06.114840627 +0000 UTC m=+386.340006985 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/aa9ff6f3-2a4b-4153-b466-1d552b2f438d-certificates") pod "keda-metrics-apiserver-7c9f485588-hn9j6" (UID: "aa9ff6f3-2a4b-4153-b466-1d552b2f438d") : references non-existent secret key: tls.crt Apr 16 22:20:05.821198 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:05.821156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fed1d227-a7b0-4ca3-89ab-0a7e10a58d90-certificates\") pod \"keda-operator-ffbb595cb-ghbds\" (UID: \"fed1d227-a7b0-4ca3-89ab-0a7e10a58d90\") " pod="openshift-keda/keda-operator-ffbb595cb-ghbds" Apr 16 22:20:05.823594 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:05.823569 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fed1d227-a7b0-4ca3-89ab-0a7e10a58d90-certificates\") pod \"keda-operator-ffbb595cb-ghbds\" (UID: \"fed1d227-a7b0-4ca3-89ab-0a7e10a58d90\") " pod="openshift-keda/keda-operator-ffbb595cb-ghbds" Apr 16 22:20:05.830365 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:05.830341 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-ghbds" Apr 16 22:20:05.961420 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:05.961383 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ghbds"] Apr 16 22:20:05.963835 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:20:05.963810 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfed1d227_a7b0_4ca3_89ab_0a7e10a58d90.slice/crio-f3e0ed8f3d5b25894d30e521c2bd1b1e6411ab7d516c75cb6da5b3321d35ec57 WatchSource:0}: Error finding container f3e0ed8f3d5b25894d30e521c2bd1b1e6411ab7d516c75cb6da5b3321d35ec57: Status 404 returned error can't find the container with id f3e0ed8f3d5b25894d30e521c2bd1b1e6411ab7d516c75cb6da5b3321d35ec57 Apr 16 22:20:06.123616 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:06.123521 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aa9ff6f3-2a4b-4153-b466-1d552b2f438d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hn9j6\" (UID: \"aa9ff6f3-2a4b-4153-b466-1d552b2f438d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" Apr 16 22:20:06.126185 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:06.126163 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/aa9ff6f3-2a4b-4153-b466-1d552b2f438d-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hn9j6\" (UID: \"aa9ff6f3-2a4b-4153-b466-1d552b2f438d\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" Apr 16 22:20:06.183126 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:06.183066 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" Apr 16 22:20:06.304036 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:06.304000 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6"] Apr 16 22:20:06.305857 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:20:06.305832 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa9ff6f3_2a4b_4153_b466_1d552b2f438d.slice/crio-05d3cce92544850f65ead2e17feac956759e2e94502284bee68e6432d1238879 WatchSource:0}: Error finding container 05d3cce92544850f65ead2e17feac956759e2e94502284bee68e6432d1238879: Status 404 returned error can't find the container with id 05d3cce92544850f65ead2e17feac956759e2e94502284bee68e6432d1238879 Apr 16 22:20:06.757947 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:06.757909 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" event={"ID":"aa9ff6f3-2a4b-4153-b466-1d552b2f438d","Type":"ContainerStarted","Data":"05d3cce92544850f65ead2e17feac956759e2e94502284bee68e6432d1238879"} Apr 16 22:20:06.758846 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:06.758824 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-ghbds" event={"ID":"fed1d227-a7b0-4ca3-89ab-0a7e10a58d90","Type":"ContainerStarted","Data":"f3e0ed8f3d5b25894d30e521c2bd1b1e6411ab7d516c75cb6da5b3321d35ec57"} Apr 16 22:20:11.777707 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:11.777598 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" event={"ID":"aa9ff6f3-2a4b-4153-b466-1d552b2f438d","Type":"ContainerStarted","Data":"c5e0d1a04d1629c97db03af9de1382c048b50f95e167b838a96087f98dbeb0b8"} Apr 16 22:20:11.778158 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:11.777743 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" Apr 16 22:20:11.778957 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:11.778936 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-ghbds" event={"ID":"fed1d227-a7b0-4ca3-89ab-0a7e10a58d90","Type":"ContainerStarted","Data":"0b96d2d7c5b26be0b4ead1f63cfb6e586b99fa07904a21ac623474e54ef10631"} Apr 16 22:20:11.779071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:11.779054 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-ghbds" Apr 16 22:20:11.794498 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:11.794427 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" podStartSLOduration=2.801305855 podStartE2EDuration="7.794411859s" podCreationTimestamp="2026-04-16 22:20:04 +0000 UTC" firstStartedPulling="2026-04-16 22:20:06.30742628 +0000 UTC m=+386.532592639" lastFinishedPulling="2026-04-16 22:20:11.300532274 +0000 UTC m=+391.525698643" observedRunningTime="2026-04-16 22:20:11.7930054 +0000 UTC m=+392.018171780" watchObservedRunningTime="2026-04-16 22:20:11.794411859 +0000 UTC m=+392.019578238" Apr 16 22:20:11.809718 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:11.809658 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-ghbds" podStartSLOduration=3.474756073 podStartE2EDuration="8.809636316s" podCreationTimestamp="2026-04-16 22:20:03 +0000 UTC" firstStartedPulling="2026-04-16 22:20:05.965209309 +0000 UTC m=+386.190375666" lastFinishedPulling="2026-04-16 22:20:11.300089532 +0000 UTC m=+391.525255909" observedRunningTime="2026-04-16 22:20:11.808226173 +0000 UTC m=+392.033392563" watchObservedRunningTime="2026-04-16 22:20:11.809636316 +0000 UTC m=+392.034802695" Apr 16 22:20:22.786542 ip-10-0-133-16 
kubenswrapper[2572]: I0416 22:20:22.786512 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hn9j6" Apr 16 22:20:32.784712 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:20:32.784676 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-ghbds" Apr 16 22:21:57.326769 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.326732 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4"] Apr 16 22:21:57.329798 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.329781 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:21:57.334265 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.334239 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:21:57.334387 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.334255 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 22:21:57.334387 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.334294 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 22:21:57.334387 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.334240 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 22:21:57.334542 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.334239 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 22:21:57.334542 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.334217 2572 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-7nvn9\"" Apr 16 22:21:57.340872 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.340846 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4"] Apr 16 22:21:57.459780 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.459747 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpp9z\" (UniqueName: \"kubernetes.io/projected/6f753f1e-c7b2-4ab6-8588-6a777fcfb68b-kube-api-access-fpp9z\") pod \"lws-controller-manager-5cdcb589b5-787l4\" (UID: \"6f753f1e-c7b2-4ab6-8588-6a777fcfb68b\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:21:57.459780 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.459787 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f753f1e-c7b2-4ab6-8588-6a777fcfb68b-cert\") pod \"lws-controller-manager-5cdcb589b5-787l4\" (UID: \"6f753f1e-c7b2-4ab6-8588-6a777fcfb68b\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:21:57.460027 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.459841 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f753f1e-c7b2-4ab6-8588-6a777fcfb68b-metrics-cert\") pod \"lws-controller-manager-5cdcb589b5-787l4\" (UID: \"6f753f1e-c7b2-4ab6-8588-6a777fcfb68b\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:21:57.460027 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.459865 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6f753f1e-c7b2-4ab6-8588-6a777fcfb68b-manager-config\") pod 
\"lws-controller-manager-5cdcb589b5-787l4\" (UID: \"6f753f1e-c7b2-4ab6-8588-6a777fcfb68b\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:21:57.561189 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.561149 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpp9z\" (UniqueName: \"kubernetes.io/projected/6f753f1e-c7b2-4ab6-8588-6a777fcfb68b-kube-api-access-fpp9z\") pod \"lws-controller-manager-5cdcb589b5-787l4\" (UID: \"6f753f1e-c7b2-4ab6-8588-6a777fcfb68b\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:21:57.561392 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.561226 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f753f1e-c7b2-4ab6-8588-6a777fcfb68b-cert\") pod \"lws-controller-manager-5cdcb589b5-787l4\" (UID: \"6f753f1e-c7b2-4ab6-8588-6a777fcfb68b\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:21:57.561392 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.561304 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f753f1e-c7b2-4ab6-8588-6a777fcfb68b-metrics-cert\") pod \"lws-controller-manager-5cdcb589b5-787l4\" (UID: \"6f753f1e-c7b2-4ab6-8588-6a777fcfb68b\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:21:57.561392 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.561342 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6f753f1e-c7b2-4ab6-8588-6a777fcfb68b-manager-config\") pod \"lws-controller-manager-5cdcb589b5-787l4\" (UID: \"6f753f1e-c7b2-4ab6-8588-6a777fcfb68b\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:21:57.562360 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.562336 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6f753f1e-c7b2-4ab6-8588-6a777fcfb68b-manager-config\") pod \"lws-controller-manager-5cdcb589b5-787l4\" (UID: \"6f753f1e-c7b2-4ab6-8588-6a777fcfb68b\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:21:57.563940 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.563918 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f753f1e-c7b2-4ab6-8588-6a777fcfb68b-cert\") pod \"lws-controller-manager-5cdcb589b5-787l4\" (UID: \"6f753f1e-c7b2-4ab6-8588-6a777fcfb68b\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:21:57.564423 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.564403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f753f1e-c7b2-4ab6-8588-6a777fcfb68b-metrics-cert\") pod \"lws-controller-manager-5cdcb589b5-787l4\" (UID: \"6f753f1e-c7b2-4ab6-8588-6a777fcfb68b\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:21:57.570708 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.570680 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpp9z\" (UniqueName: \"kubernetes.io/projected/6f753f1e-c7b2-4ab6-8588-6a777fcfb68b-kube-api-access-fpp9z\") pod \"lws-controller-manager-5cdcb589b5-787l4\" (UID: \"6f753f1e-c7b2-4ab6-8588-6a777fcfb68b\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:21:57.639866 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.639773 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:21:57.762983 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:57.762953 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4"] Apr 16 22:21:57.765778 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:21:57.765751 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f753f1e_c7b2_4ab6_8588_6a777fcfb68b.slice/crio-2c942f17732a22aabf850b27fbd61fec1ae450a247cb3f5abf9d0749b2f04f93 WatchSource:0}: Error finding container 2c942f17732a22aabf850b27fbd61fec1ae450a247cb3f5abf9d0749b2f04f93: Status 404 returned error can't find the container with id 2c942f17732a22aabf850b27fbd61fec1ae450a247cb3f5abf9d0749b2f04f93 Apr 16 22:21:58.114156 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:21:58.114116 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" event={"ID":"6f753f1e-c7b2-4ab6-8588-6a777fcfb68b","Type":"ContainerStarted","Data":"2c942f17732a22aabf850b27fbd61fec1ae450a247cb3f5abf9d0749b2f04f93"} Apr 16 22:22:01.125876 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:01.125840 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" event={"ID":"6f753f1e-c7b2-4ab6-8588-6a777fcfb68b","Type":"ContainerStarted","Data":"ece8eb5303015123916c25a576008b04099db115ed2f5c902253e91fc265d5ab"} Apr 16 22:22:01.126290 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:01.125976 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:22:01.143553 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:01.143505 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" podStartSLOduration=1.8261139769999999 podStartE2EDuration="4.143492186s" podCreationTimestamp="2026-04-16 22:21:57 +0000 UTC" firstStartedPulling="2026-04-16 22:21:57.767888029 +0000 UTC m=+497.993054387" lastFinishedPulling="2026-04-16 22:22:00.085266235 +0000 UTC m=+500.310432596" observedRunningTime="2026-04-16 22:22:01.141567211 +0000 UTC m=+501.366733592" watchObservedRunningTime="2026-04-16 22:22:01.143492186 +0000 UTC m=+501.368658566" Apr 16 22:22:12.130849 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:12.130816 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-787l4" Apr 16 22:22:57.361221 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.361181 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd"] Apr 16 22:22:57.364406 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.364390 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd" Apr 16 22:22:57.367268 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.367247 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 16 22:22:57.367563 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.367535 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 16 22:22:57.367683 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.367566 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 22:22:57.368759 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.368742 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-qpntl\"" Apr 16 22:22:57.368832 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.368746 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 22:22:57.375822 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.375799 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd"] Apr 16 22:22:57.459788 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.459754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2649eb6-0c6b-4152-8f3b-7cca119c92b2-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-rzvrd\" (UID: \"a2649eb6-0c6b-4152-8f3b-7cca119c92b2\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd" Apr 16 22:22:57.459788 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.459798 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-d8lkp\" (UniqueName: \"kubernetes.io/projected/a2649eb6-0c6b-4152-8f3b-7cca119c92b2-kube-api-access-d8lkp\") pod \"kuadrant-console-plugin-6c886788f8-rzvrd\" (UID: \"a2649eb6-0c6b-4152-8f3b-7cca119c92b2\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd" Apr 16 22:22:57.459993 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.459822 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a2649eb6-0c6b-4152-8f3b-7cca119c92b2-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-rzvrd\" (UID: \"a2649eb6-0c6b-4152-8f3b-7cca119c92b2\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd" Apr 16 22:22:57.560600 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.560563 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2649eb6-0c6b-4152-8f3b-7cca119c92b2-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-rzvrd\" (UID: \"a2649eb6-0c6b-4152-8f3b-7cca119c92b2\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd" Apr 16 22:22:57.560750 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.560609 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8lkp\" (UniqueName: \"kubernetes.io/projected/a2649eb6-0c6b-4152-8f3b-7cca119c92b2-kube-api-access-d8lkp\") pod \"kuadrant-console-plugin-6c886788f8-rzvrd\" (UID: \"a2649eb6-0c6b-4152-8f3b-7cca119c92b2\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd" Apr 16 22:22:57.560750 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.560628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a2649eb6-0c6b-4152-8f3b-7cca119c92b2-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-rzvrd\" (UID: \"a2649eb6-0c6b-4152-8f3b-7cca119c92b2\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd" Apr 16 22:22:57.561262 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.561244 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a2649eb6-0c6b-4152-8f3b-7cca119c92b2-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-rzvrd\" (UID: \"a2649eb6-0c6b-4152-8f3b-7cca119c92b2\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd" Apr 16 22:22:57.563093 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.563065 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2649eb6-0c6b-4152-8f3b-7cca119c92b2-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-rzvrd\" (UID: \"a2649eb6-0c6b-4152-8f3b-7cca119c92b2\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd" Apr 16 22:22:57.569299 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.569274 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8lkp\" (UniqueName: \"kubernetes.io/projected/a2649eb6-0c6b-4152-8f3b-7cca119c92b2-kube-api-access-d8lkp\") pod \"kuadrant-console-plugin-6c886788f8-rzvrd\" (UID: \"a2649eb6-0c6b-4152-8f3b-7cca119c92b2\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd" Apr 16 22:22:57.685289 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.685201 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd" Apr 16 22:22:57.808853 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:57.808827 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd"] Apr 16 22:22:57.810888 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:22:57.810861 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2649eb6_0c6b_4152_8f3b_7cca119c92b2.slice/crio-ae37ab2f19d5b7b25dd74bf361d8abcd5b80c3e6a7fda8ff0fa2cddd58a0293a WatchSource:0}: Error finding container ae37ab2f19d5b7b25dd74bf361d8abcd5b80c3e6a7fda8ff0fa2cddd58a0293a: Status 404 returned error can't find the container with id ae37ab2f19d5b7b25dd74bf361d8abcd5b80c3e6a7fda8ff0fa2cddd58a0293a Apr 16 22:22:58.304517 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:22:58.304477 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd" event={"ID":"a2649eb6-0c6b-4152-8f3b-7cca119c92b2","Type":"ContainerStarted","Data":"ae37ab2f19d5b7b25dd74bf361d8abcd5b80c3e6a7fda8ff0fa2cddd58a0293a"} Apr 16 22:23:03.320804 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:03.320761 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd" event={"ID":"a2649eb6-0c6b-4152-8f3b-7cca119c92b2","Type":"ContainerStarted","Data":"96f2faf44370ca867af6757947ee9b8247ce5a4c9121d96abec8c659de040e21"} Apr 16 22:23:03.365097 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:03.365040 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rzvrd" podStartSLOduration=1.908987834 podStartE2EDuration="6.365025637s" podCreationTimestamp="2026-04-16 22:22:57 +0000 UTC" firstStartedPulling="2026-04-16 22:22:57.812088357 +0000 UTC m=+558.037254718" lastFinishedPulling="2026-04-16 
22:23:02.268126147 +0000 UTC m=+562.493292521" observedRunningTime="2026-04-16 22:23:03.363998087 +0000 UTC m=+563.589164468" watchObservedRunningTime="2026-04-16 22:23:03.365025637 +0000 UTC m=+563.590192014" Apr 16 22:23:39.877221 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:39.877183 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-kpr2g"] Apr 16 22:23:39.901847 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:39.901814 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-kpr2g"] Apr 16 22:23:39.902017 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:39.901952 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-kpr2g" Apr 16 22:23:39.903521 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:39.903493 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-kpr2g"] Apr 16 22:23:39.904675 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:39.904656 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 22:23:40.013513 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.013470 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgwfx\" (UniqueName: \"kubernetes.io/projected/891f6d15-dc70-46a5-b577-177289b5f30d-kube-api-access-hgwfx\") pod \"limitador-limitador-67566c68b4-kpr2g\" (UID: \"891f6d15-dc70-46a5-b577-177289b5f30d\") " pod="kuadrant-system/limitador-limitador-67566c68b4-kpr2g" Apr 16 22:23:40.013513 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.013513 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/891f6d15-dc70-46a5-b577-177289b5f30d-config-file\") pod 
\"limitador-limitador-67566c68b4-kpr2g\" (UID: \"891f6d15-dc70-46a5-b577-177289b5f30d\") " pod="kuadrant-system/limitador-limitador-67566c68b4-kpr2g" Apr 16 22:23:40.115044 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.114996 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgwfx\" (UniqueName: \"kubernetes.io/projected/891f6d15-dc70-46a5-b577-177289b5f30d-kube-api-access-hgwfx\") pod \"limitador-limitador-67566c68b4-kpr2g\" (UID: \"891f6d15-dc70-46a5-b577-177289b5f30d\") " pod="kuadrant-system/limitador-limitador-67566c68b4-kpr2g" Apr 16 22:23:40.115044 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.115044 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/891f6d15-dc70-46a5-b577-177289b5f30d-config-file\") pod \"limitador-limitador-67566c68b4-kpr2g\" (UID: \"891f6d15-dc70-46a5-b577-177289b5f30d\") " pod="kuadrant-system/limitador-limitador-67566c68b4-kpr2g" Apr 16 22:23:40.115650 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.115631 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/891f6d15-dc70-46a5-b577-177289b5f30d-config-file\") pod \"limitador-limitador-67566c68b4-kpr2g\" (UID: \"891f6d15-dc70-46a5-b577-177289b5f30d\") " pod="kuadrant-system/limitador-limitador-67566c68b4-kpr2g" Apr 16 22:23:40.123769 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.123745 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgwfx\" (UniqueName: \"kubernetes.io/projected/891f6d15-dc70-46a5-b577-177289b5f30d-kube-api-access-hgwfx\") pod \"limitador-limitador-67566c68b4-kpr2g\" (UID: \"891f6d15-dc70-46a5-b577-177289b5f30d\") " pod="kuadrant-system/limitador-limitador-67566c68b4-kpr2g" Apr 16 22:23:40.212250 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.212162 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-kpr2g" Apr 16 22:23:40.284194 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.284166 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-vcltb"] Apr 16 22:23:40.288912 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.288884 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-vcltb" Apr 16 22:23:40.291668 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.291640 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-qn88w\"" Apr 16 22:23:40.296944 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.296918 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-vcltb"] Apr 16 22:23:40.351958 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.351931 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-kpr2g"] Apr 16 22:23:40.354559 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:23:40.354529 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod891f6d15_dc70_46a5_b577_177289b5f30d.slice/crio-0d62b82cd5f86dd985eb0ed75bf453e8f551831f354c6775058a27ecf6a2979d WatchSource:0}: Error finding container 0d62b82cd5f86dd985eb0ed75bf453e8f551831f354c6775058a27ecf6a2979d: Status 404 returned error can't find the container with id 0d62b82cd5f86dd985eb0ed75bf453e8f551831f354c6775058a27ecf6a2979d Apr 16 22:23:40.362862 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.362832 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/0.log" Apr 16 22:23:40.362978 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.362886 2572 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/0.log" Apr 16 22:23:40.418224 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.418192 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t7vb\" (UniqueName: \"kubernetes.io/projected/809f7b53-a3bd-47fe-a1f9-a4205275567f-kube-api-access-8t7vb\") pod \"authorino-674b59b84c-vcltb\" (UID: \"809f7b53-a3bd-47fe-a1f9-a4205275567f\") " pod="kuadrant-system/authorino-674b59b84c-vcltb" Apr 16 22:23:40.443731 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.443699 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-kpr2g" event={"ID":"891f6d15-dc70-46a5-b577-177289b5f30d","Type":"ContainerStarted","Data":"0d62b82cd5f86dd985eb0ed75bf453e8f551831f354c6775058a27ecf6a2979d"} Apr 16 22:23:40.519378 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.519278 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8t7vb\" (UniqueName: \"kubernetes.io/projected/809f7b53-a3bd-47fe-a1f9-a4205275567f-kube-api-access-8t7vb\") pod \"authorino-674b59b84c-vcltb\" (UID: \"809f7b53-a3bd-47fe-a1f9-a4205275567f\") " pod="kuadrant-system/authorino-674b59b84c-vcltb" Apr 16 22:23:40.531703 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.531672 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t7vb\" (UniqueName: \"kubernetes.io/projected/809f7b53-a3bd-47fe-a1f9-a4205275567f-kube-api-access-8t7vb\") pod \"authorino-674b59b84c-vcltb\" (UID: \"809f7b53-a3bd-47fe-a1f9-a4205275567f\") " pod="kuadrant-system/authorino-674b59b84c-vcltb" Apr 16 22:23:40.602973 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.602941 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-vcltb" Apr 16 22:23:40.729301 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:40.729273 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-vcltb"] Apr 16 22:23:40.731441 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:23:40.731413 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod809f7b53_a3bd_47fe_a1f9_a4205275567f.slice/crio-0afcdeb1a93cae65d0833a2a6b2fa263ebac3e1e3f829bbabc3d6101fa3b6543 WatchSource:0}: Error finding container 0afcdeb1a93cae65d0833a2a6b2fa263ebac3e1e3f829bbabc3d6101fa3b6543: Status 404 returned error can't find the container with id 0afcdeb1a93cae65d0833a2a6b2fa263ebac3e1e3f829bbabc3d6101fa3b6543 Apr 16 22:23:41.445582 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:41.445496 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-vcltb" event={"ID":"809f7b53-a3bd-47fe-a1f9-a4205275567f","Type":"ContainerStarted","Data":"0afcdeb1a93cae65d0833a2a6b2fa263ebac3e1e3f829bbabc3d6101fa3b6543"} Apr 16 22:23:41.447344 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:41.447311 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-kpr2g" event={"ID":"891f6d15-dc70-46a5-b577-177289b5f30d","Type":"ContainerStarted","Data":"7309c745538024b85a37bb1f211a9d1cf7c633354ef73f34052616e089298381"} Apr 16 22:23:41.447469 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:41.447451 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-kpr2g" Apr 16 22:23:41.464948 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:41.464887 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-kpr2g" podStartSLOduration=1.637429598 
podStartE2EDuration="2.464867187s" podCreationTimestamp="2026-04-16 22:23:39 +0000 UTC" firstStartedPulling="2026-04-16 22:23:40.35665042 +0000 UTC m=+600.581816791" lastFinishedPulling="2026-04-16 22:23:41.184088012 +0000 UTC m=+601.409254380" observedRunningTime="2026-04-16 22:23:41.463550427 +0000 UTC m=+601.688716815" watchObservedRunningTime="2026-04-16 22:23:41.464867187 +0000 UTC m=+601.690033568" Apr 16 22:23:43.461490 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:43.461383 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-vcltb" event={"ID":"809f7b53-a3bd-47fe-a1f9-a4205275567f","Type":"ContainerStarted","Data":"8642f5eebb3fa3ff8601054205cf379df0d658ce9553c59a926c9be9b678d57a"} Apr 16 22:23:43.476514 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:43.476455 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-vcltb" podStartSLOduration=1.05766165 podStartE2EDuration="3.47644034s" podCreationTimestamp="2026-04-16 22:23:40 +0000 UTC" firstStartedPulling="2026-04-16 22:23:40.732754413 +0000 UTC m=+600.957920770" lastFinishedPulling="2026-04-16 22:23:43.151533101 +0000 UTC m=+603.376699460" observedRunningTime="2026-04-16 22:23:43.475278274 +0000 UTC m=+603.700444655" watchObservedRunningTime="2026-04-16 22:23:43.47644034 +0000 UTC m=+603.701606719" Apr 16 22:23:46.190577 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:46.190540 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-vcltb"] Apr 16 22:23:46.191209 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:46.190725 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-vcltb" podUID="809f7b53-a3bd-47fe-a1f9-a4205275567f" containerName="authorino" containerID="cri-o://8642f5eebb3fa3ff8601054205cf379df0d658ce9553c59a926c9be9b678d57a" gracePeriod=30 Apr 16 22:23:46.438381 ip-10-0-133-16 
kubenswrapper[2572]: I0416 22:23:46.438358 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-vcltb" Apr 16 22:23:46.472751 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:46.472673 2572 generic.go:358] "Generic (PLEG): container finished" podID="809f7b53-a3bd-47fe-a1f9-a4205275567f" containerID="8642f5eebb3fa3ff8601054205cf379df0d658ce9553c59a926c9be9b678d57a" exitCode=0 Apr 16 22:23:46.472751 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:46.472711 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-vcltb" event={"ID":"809f7b53-a3bd-47fe-a1f9-a4205275567f","Type":"ContainerDied","Data":"8642f5eebb3fa3ff8601054205cf379df0d658ce9553c59a926c9be9b678d57a"} Apr 16 22:23:46.472751 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:46.472732 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-vcltb" event={"ID":"809f7b53-a3bd-47fe-a1f9-a4205275567f","Type":"ContainerDied","Data":"0afcdeb1a93cae65d0833a2a6b2fa263ebac3e1e3f829bbabc3d6101fa3b6543"} Apr 16 22:23:46.472751 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:46.472736 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-vcltb" Apr 16 22:23:46.473026 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:46.472748 2572 scope.go:117] "RemoveContainer" containerID="8642f5eebb3fa3ff8601054205cf379df0d658ce9553c59a926c9be9b678d57a" Apr 16 22:23:46.481010 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:46.480985 2572 scope.go:117] "RemoveContainer" containerID="8642f5eebb3fa3ff8601054205cf379df0d658ce9553c59a926c9be9b678d57a" Apr 16 22:23:46.481375 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:23:46.481357 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8642f5eebb3fa3ff8601054205cf379df0d658ce9553c59a926c9be9b678d57a\": container with ID starting with 8642f5eebb3fa3ff8601054205cf379df0d658ce9553c59a926c9be9b678d57a not found: ID does not exist" containerID="8642f5eebb3fa3ff8601054205cf379df0d658ce9553c59a926c9be9b678d57a" Apr 16 22:23:46.481436 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:46.481384 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8642f5eebb3fa3ff8601054205cf379df0d658ce9553c59a926c9be9b678d57a"} err="failed to get container status \"8642f5eebb3fa3ff8601054205cf379df0d658ce9553c59a926c9be9b678d57a\": rpc error: code = NotFound desc = could not find container \"8642f5eebb3fa3ff8601054205cf379df0d658ce9553c59a926c9be9b678d57a\": container with ID starting with 8642f5eebb3fa3ff8601054205cf379df0d658ce9553c59a926c9be9b678d57a not found: ID does not exist" Apr 16 22:23:46.572364 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:46.572330 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t7vb\" (UniqueName: \"kubernetes.io/projected/809f7b53-a3bd-47fe-a1f9-a4205275567f-kube-api-access-8t7vb\") pod \"809f7b53-a3bd-47fe-a1f9-a4205275567f\" (UID: \"809f7b53-a3bd-47fe-a1f9-a4205275567f\") " Apr 16 22:23:46.574643 ip-10-0-133-16 
kubenswrapper[2572]: I0416 22:23:46.574621 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/809f7b53-a3bd-47fe-a1f9-a4205275567f-kube-api-access-8t7vb" (OuterVolumeSpecName: "kube-api-access-8t7vb") pod "809f7b53-a3bd-47fe-a1f9-a4205275567f" (UID: "809f7b53-a3bd-47fe-a1f9-a4205275567f"). InnerVolumeSpecName "kube-api-access-8t7vb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:23:46.673921 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:46.673888 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8t7vb\" (UniqueName: \"kubernetes.io/projected/809f7b53-a3bd-47fe-a1f9-a4205275567f-kube-api-access-8t7vb\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:23:46.793563 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:46.793533 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-vcltb"] Apr 16 22:23:46.798649 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:46.798624 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-vcltb"] Apr 16 22:23:48.444977 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:48.444879 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="809f7b53-a3bd-47fe-a1f9-a4205275567f" path="/var/lib/kubelet/pods/809f7b53-a3bd-47fe-a1f9-a4205275567f/volumes" Apr 16 22:23:52.458328 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:23:52.458296 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-kpr2g" Apr 16 22:25:48.526326 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.526286 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-x2gnk"] Apr 16 22:25:48.526803 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.526639 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="809f7b53-a3bd-47fe-a1f9-a4205275567f" 
containerName="authorino" Apr 16 22:25:48.526803 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.526651 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="809f7b53-a3bd-47fe-a1f9-a4205275567f" containerName="authorino" Apr 16 22:25:48.526803 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.526708 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="809f7b53-a3bd-47fe-a1f9-a4205275567f" containerName="authorino" Apr 16 22:25:48.528686 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.528670 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-x2gnk" Apr 16 22:25:48.530738 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.530710 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pd85\" (UniqueName: \"kubernetes.io/projected/5ccfdb52-5fff-4aed-9172-fb9f35a8dd34-kube-api-access-6pd85\") pod \"s3-init-x2gnk\" (UID: \"5ccfdb52-5fff-4aed-9172-fb9f35a8dd34\") " pod="kserve/s3-init-x2gnk" Apr 16 22:25:48.531507 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.531486 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-lfzqp\"" Apr 16 22:25:48.531507 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.531500 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 22:25:48.531665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.531507 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 22:25:48.531665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.531497 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 22:25:48.534809 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.534784 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-x2gnk"] Apr 
16 22:25:48.631526 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.631475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pd85\" (UniqueName: \"kubernetes.io/projected/5ccfdb52-5fff-4aed-9172-fb9f35a8dd34-kube-api-access-6pd85\") pod \"s3-init-x2gnk\" (UID: \"5ccfdb52-5fff-4aed-9172-fb9f35a8dd34\") " pod="kserve/s3-init-x2gnk" Apr 16 22:25:48.639974 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.639933 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pd85\" (UniqueName: \"kubernetes.io/projected/5ccfdb52-5fff-4aed-9172-fb9f35a8dd34-kube-api-access-6pd85\") pod \"s3-init-x2gnk\" (UID: \"5ccfdb52-5fff-4aed-9172-fb9f35a8dd34\") " pod="kserve/s3-init-x2gnk" Apr 16 22:25:48.838742 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.838699 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-x2gnk" Apr 16 22:25:48.965146 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.965121 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-x2gnk"] Apr 16 22:25:48.966849 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:25:48.966815 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ccfdb52_5fff_4aed_9172_fb9f35a8dd34.slice/crio-e7c69b76523a4e8993323fba3a8ba55f3048c10c23556a94989fd13932d308e9 WatchSource:0}: Error finding container e7c69b76523a4e8993323fba3a8ba55f3048c10c23556a94989fd13932d308e9: Status 404 returned error can't find the container with id e7c69b76523a4e8993323fba3a8ba55f3048c10c23556a94989fd13932d308e9 Apr 16 22:25:48.968524 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:48.968502 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:25:49.890052 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:49.890014 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/s3-init-x2gnk" event={"ID":"5ccfdb52-5fff-4aed-9172-fb9f35a8dd34","Type":"ContainerStarted","Data":"e7c69b76523a4e8993323fba3a8ba55f3048c10c23556a94989fd13932d308e9"} Apr 16 22:25:53.912953 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:53.912913 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-x2gnk" event={"ID":"5ccfdb52-5fff-4aed-9172-fb9f35a8dd34","Type":"ContainerStarted","Data":"8b5649488dbd90e86a053a3d897c48d7e913315f65d1b2c9d0ac7cd3b15c5d1b"} Apr 16 22:25:53.927816 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:53.927753 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-x2gnk" podStartSLOduration=1.53819974 podStartE2EDuration="5.927734787s" podCreationTimestamp="2026-04-16 22:25:48 +0000 UTC" firstStartedPulling="2026-04-16 22:25:48.968636653 +0000 UTC m=+729.193803011" lastFinishedPulling="2026-04-16 22:25:53.358171697 +0000 UTC m=+733.583338058" observedRunningTime="2026-04-16 22:25:53.92735969 +0000 UTC m=+734.152526096" watchObservedRunningTime="2026-04-16 22:25:53.927734787 +0000 UTC m=+734.152901176" Apr 16 22:25:56.923807 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:56.923773 2572 generic.go:358] "Generic (PLEG): container finished" podID="5ccfdb52-5fff-4aed-9172-fb9f35a8dd34" containerID="8b5649488dbd90e86a053a3d897c48d7e913315f65d1b2c9d0ac7cd3b15c5d1b" exitCode=0 Apr 16 22:25:56.923807 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:56.923810 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-x2gnk" event={"ID":"5ccfdb52-5fff-4aed-9172-fb9f35a8dd34","Type":"ContainerDied","Data":"8b5649488dbd90e86a053a3d897c48d7e913315f65d1b2c9d0ac7cd3b15c5d1b"} Apr 16 22:25:58.058834 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:58.058809 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-x2gnk" Apr 16 22:25:58.105545 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:58.105510 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pd85\" (UniqueName: \"kubernetes.io/projected/5ccfdb52-5fff-4aed-9172-fb9f35a8dd34-kube-api-access-6pd85\") pod \"5ccfdb52-5fff-4aed-9172-fb9f35a8dd34\" (UID: \"5ccfdb52-5fff-4aed-9172-fb9f35a8dd34\") " Apr 16 22:25:58.107708 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:58.107673 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ccfdb52-5fff-4aed-9172-fb9f35a8dd34-kube-api-access-6pd85" (OuterVolumeSpecName: "kube-api-access-6pd85") pod "5ccfdb52-5fff-4aed-9172-fb9f35a8dd34" (UID: "5ccfdb52-5fff-4aed-9172-fb9f35a8dd34"). InnerVolumeSpecName "kube-api-access-6pd85". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:25:58.206974 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:58.206885 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6pd85\" (UniqueName: \"kubernetes.io/projected/5ccfdb52-5fff-4aed-9172-fb9f35a8dd34-kube-api-access-6pd85\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:25:58.931699 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:58.931658 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-x2gnk" event={"ID":"5ccfdb52-5fff-4aed-9172-fb9f35a8dd34","Type":"ContainerDied","Data":"e7c69b76523a4e8993323fba3a8ba55f3048c10c23556a94989fd13932d308e9"} Apr 16 22:25:58.931699 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:58.931682 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-x2gnk" Apr 16 22:25:58.931699 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:25:58.931697 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7c69b76523a4e8993323fba3a8ba55f3048c10c23556a94989fd13932d308e9" Apr 16 22:26:32.672435 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.672398 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr"] Apr 16 22:26:32.672896 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.672754 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ccfdb52-5fff-4aed-9172-fb9f35a8dd34" containerName="s3-init" Apr 16 22:26:32.672896 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.672766 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ccfdb52-5fff-4aed-9172-fb9f35a8dd34" containerName="s3-init" Apr 16 22:26:32.672896 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.672828 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ccfdb52-5fff-4aed-9172-fb9f35a8dd34" containerName="s3-init" Apr 16 22:26:32.708277 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.708234 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr"] Apr 16 22:26:32.708448 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.708403 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.711350 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.711316 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 22:26:32.711475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.711379 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 16 22:26:32.711475 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.711462 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pt4qs\"" Apr 16 22:26:32.711556 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.711475 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 22:26:32.795173 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.795136 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.795173 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.795176 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.795395 
ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.795281 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.795395 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.795302 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.795395 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.795346 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.795395 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.795370 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psnqx\" (UniqueName: \"kubernetes.io/projected/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-kube-api-access-psnqx\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.896248 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.896202 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.896445 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.896271 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.896445 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.896292 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.896445 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.896326 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.896614 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.896444 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psnqx\" (UniqueName: \"kubernetes.io/projected/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-kube-api-access-psnqx\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.896614 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.896520 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.896719 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.896633 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.896719 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.896689 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 
22:26:32.896848 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.896832 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.898605 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.898581 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.899052 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.899036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:32.904343 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:32.904323 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psnqx\" (UniqueName: \"kubernetes.io/projected/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-kube-api-access-psnqx\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:33.018619 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:33.018526 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:33.149656 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:33.149628 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr"] Apr 16 22:26:33.151318 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:26:33.151290 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf17a5d06_a261_40cc_91d3_ff0ed3de4cde.slice/crio-55544a57cd43c18fb31be6b262f7eba5087f69b299894138fcba39b96902b5d7 WatchSource:0}: Error finding container 55544a57cd43c18fb31be6b262f7eba5087f69b299894138fcba39b96902b5d7: Status 404 returned error can't find the container with id 55544a57cd43c18fb31be6b262f7eba5087f69b299894138fcba39b96902b5d7 Apr 16 22:26:34.049716 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:34.049672 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" event={"ID":"f17a5d06-a261-40cc-91d3-ff0ed3de4cde","Type":"ContainerStarted","Data":"55544a57cd43c18fb31be6b262f7eba5087f69b299894138fcba39b96902b5d7"} Apr 16 22:26:37.061681 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:37.061630 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" event={"ID":"f17a5d06-a261-40cc-91d3-ff0ed3de4cde","Type":"ContainerStarted","Data":"52eb4c04d849afce4cebc150f9e942f71026685127945c63ed91d05eff8b2a33"} Apr 16 22:26:41.078028 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:41.077934 2572 generic.go:358] "Generic (PLEG): container finished" podID="f17a5d06-a261-40cc-91d3-ff0ed3de4cde" containerID="52eb4c04d849afce4cebc150f9e942f71026685127945c63ed91d05eff8b2a33" exitCode=0 Apr 16 22:26:41.078028 
ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:41.077990 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" event={"ID":"f17a5d06-a261-40cc-91d3-ff0ed3de4cde","Type":"ContainerDied","Data":"52eb4c04d849afce4cebc150f9e942f71026685127945c63ed91d05eff8b2a33"} Apr 16 22:26:43.085283 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:43.085244 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" event={"ID":"f17a5d06-a261-40cc-91d3-ff0ed3de4cde","Type":"ContainerStarted","Data":"63bb01aa09212db02b16edb69367140bab2ce7b375c7a5f4647eab063cc0487f"} Apr 16 22:26:43.103391 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:43.103336 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" podStartSLOduration=2.140966411 podStartE2EDuration="11.103318732s" podCreationTimestamp="2026-04-16 22:26:32 +0000 UTC" firstStartedPulling="2026-04-16 22:26:33.153246383 +0000 UTC m=+773.378412741" lastFinishedPulling="2026-04-16 22:26:42.115598692 +0000 UTC m=+782.340765062" observedRunningTime="2026-04-16 22:26:43.101783622 +0000 UTC m=+783.326950029" watchObservedRunningTime="2026-04-16 22:26:43.103318732 +0000 UTC m=+783.328485112" Apr 16 22:26:53.019483 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:53.019394 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:53.019483 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:53.019442 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:53.031958 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:53.031926 2572 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:26:53.130066 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:26:53.130032 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" Apr 16 22:27:19.551519 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.551485 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7"] Apr 16 22:27:19.571395 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.571365 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7"] Apr 16 22:27:19.571554 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.571486 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.574047 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.574022 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 22:27:19.702578 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.702541 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b94d77a3-f092-49cc-9e01-8f124c269922-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.702789 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.702585 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.702789 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.702705 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4n57\" (UniqueName: \"kubernetes.io/projected/b94d77a3-f092-49cc-9e01-8f124c269922-kube-api-access-f4n57\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.702789 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.702758 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-home\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.702928 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.702799 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-dshm\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.702928 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.702828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-model-cache\") pod 
\"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.803442 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.803347 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b94d77a3-f092-49cc-9e01-8f124c269922-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.803442 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.803398 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.803659 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.803470 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4n57\" (UniqueName: \"kubernetes.io/projected/b94d77a3-f092-49cc-9e01-8f124c269922-kube-api-access-f4n57\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.803659 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.803507 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-home\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.803659 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.803546 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-dshm\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.803659 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.803571 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-model-cache\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.803889 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.803859 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.804070 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.804049 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-home\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.804169 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.804068 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-model-cache\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.805847 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.805824 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-dshm\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.805998 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.805982 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b94d77a3-f092-49cc-9e01-8f124c269922-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.813482 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.813457 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4n57\" (UniqueName: \"kubernetes.io/projected/b94d77a3-f092-49cc-9e01-8f124c269922-kube-api-access-f4n57\") pod \"scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:19.882449 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:19.882415 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:20.012516 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:20.012488 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7"] Apr 16 22:27:20.015204 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:27:20.015179 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb94d77a3_f092_49cc_9e01_8f124c269922.slice/crio-eeb68626e85632b01d53d55a921f113056fdf45380f85c38b67be9a263e22981 WatchSource:0}: Error finding container eeb68626e85632b01d53d55a921f113056fdf45380f85c38b67be9a263e22981: Status 404 returned error can't find the container with id eeb68626e85632b01d53d55a921f113056fdf45380f85c38b67be9a263e22981 Apr 16 22:27:20.212984 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:20.212952 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" event={"ID":"b94d77a3-f092-49cc-9e01-8f124c269922","Type":"ContainerStarted","Data":"1d817b4329e70ab097214059fbc0be080b60414004c20ea2fc5301d9919bb3a1"} Apr 16 22:27:20.213177 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:20.212991 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" event={"ID":"b94d77a3-f092-49cc-9e01-8f124c269922","Type":"ContainerStarted","Data":"eeb68626e85632b01d53d55a921f113056fdf45380f85c38b67be9a263e22981"} Apr 16 22:27:24.227035 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:24.226995 2572 generic.go:358] "Generic (PLEG): container finished" podID="b94d77a3-f092-49cc-9e01-8f124c269922" containerID="1d817b4329e70ab097214059fbc0be080b60414004c20ea2fc5301d9919bb3a1" exitCode=0 Apr 16 22:27:24.227407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:24.227065 2572 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" event={"ID":"b94d77a3-f092-49cc-9e01-8f124c269922","Type":"ContainerDied","Data":"1d817b4329e70ab097214059fbc0be080b60414004c20ea2fc5301d9919bb3a1"} Apr 16 22:27:25.232421 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:25.232380 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" event={"ID":"b94d77a3-f092-49cc-9e01-8f124c269922","Type":"ContainerStarted","Data":"5796dc0c5662ecf6549d04f92dd6f12bbafd0b772f48dd9ce3ec64634d3c727b"} Apr 16 22:27:25.251372 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:25.251321 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" podStartSLOduration=6.251307343 podStartE2EDuration="6.251307343s" podCreationTimestamp="2026-04-16 22:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:27:25.249475987 +0000 UTC m=+825.474642367" watchObservedRunningTime="2026-04-16 22:27:25.251307343 +0000 UTC m=+825.476473722" Apr 16 22:27:29.883628 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:29.883589 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:29.883628 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:29.883626 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:29.895905 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:29.895870 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:30.260283 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:30.260193 2572 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" Apr 16 22:27:36.039306 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.039268 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr"] Apr 16 22:27:36.039707 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.039598 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" podUID="f17a5d06-a261-40cc-91d3-ff0ed3de4cde" containerName="main" containerID="cri-o://63bb01aa09212db02b16edb69367140bab2ce7b375c7a5f4647eab063cc0487f" gracePeriod=30 Apr 16 22:27:36.271382 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.271347 2572 generic.go:358] "Generic (PLEG): container finished" podID="f17a5d06-a261-40cc-91d3-ff0ed3de4cde" containerID="63bb01aa09212db02b16edb69367140bab2ce7b375c7a5f4647eab063cc0487f" exitCode=0 Apr 16 22:27:36.271539 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.271424 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" event={"ID":"f17a5d06-a261-40cc-91d3-ff0ed3de4cde","Type":"ContainerDied","Data":"63bb01aa09212db02b16edb69367140bab2ce7b375c7a5f4647eab063cc0487f"} Apr 16 22:27:36.298607 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.298538 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr"
Apr 16 22:27:36.345620 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.345585 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psnqx\" (UniqueName: \"kubernetes.io/projected/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-kube-api-access-psnqx\") pod \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") "
Apr 16 22:27:36.345620 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.345627 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-model-cache\") pod \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") "
Apr 16 22:27:36.345840 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.345676 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-kserve-provision-location\") pod \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") "
Apr 16 22:27:36.345840 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.345712 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-tls-certs\") pod \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") "
Apr 16 22:27:36.345840 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.345802 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-home\") pod \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") "
Apr 16 22:27:36.345840 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.345832 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-dshm\") pod \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\" (UID: \"f17a5d06-a261-40cc-91d3-ff0ed3de4cde\") "
Apr 16 22:27:36.346046 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.345966 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-model-cache" (OuterVolumeSpecName: "model-cache") pod "f17a5d06-a261-40cc-91d3-ff0ed3de4cde" (UID: "f17a5d06-a261-40cc-91d3-ff0ed3de4cde"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:27:36.346128 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.346059 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-home" (OuterVolumeSpecName: "home") pod "f17a5d06-a261-40cc-91d3-ff0ed3de4cde" (UID: "f17a5d06-a261-40cc-91d3-ff0ed3de4cde"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:27:36.346246 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.346225 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-home\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:27:36.346326 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.346251 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-model-cache\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:27:36.347983 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.347954 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-dshm" (OuterVolumeSpecName: "dshm") pod "f17a5d06-a261-40cc-91d3-ff0ed3de4cde" (UID: "f17a5d06-a261-40cc-91d3-ff0ed3de4cde"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:27:36.347983 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.347970 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f17a5d06-a261-40cc-91d3-ff0ed3de4cde" (UID: "f17a5d06-a261-40cc-91d3-ff0ed3de4cde"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:27:36.348134 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.347961 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-kube-api-access-psnqx" (OuterVolumeSpecName: "kube-api-access-psnqx") pod "f17a5d06-a261-40cc-91d3-ff0ed3de4cde" (UID: "f17a5d06-a261-40cc-91d3-ff0ed3de4cde"). InnerVolumeSpecName "kube-api-access-psnqx".
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:27:36.399641 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.399573 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f17a5d06-a261-40cc-91d3-ff0ed3de4cde" (UID: "f17a5d06-a261-40cc-91d3-ff0ed3de4cde"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:27:36.446667 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.446635 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-dshm\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:27:36.446667 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.446661 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-psnqx\" (UniqueName: \"kubernetes.io/projected/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-kube-api-access-psnqx\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:27:36.446667 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.446673 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-kserve-provision-location\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:27:36.446892 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:36.446682 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f17a5d06-a261-40cc-91d3-ff0ed3de4cde-tls-certs\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:27:37.276648 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:37.276616 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr" event={"ID":"f17a5d06-a261-40cc-91d3-ff0ed3de4cde","Type":"ContainerDied","Data":"55544a57cd43c18fb31be6b262f7eba5087f69b299894138fcba39b96902b5d7"}
Apr 16 22:27:37.277045 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:37.276666 2572 scope.go:117] "RemoveContainer" containerID="63bb01aa09212db02b16edb69367140bab2ce7b375c7a5f4647eab063cc0487f"
Apr 16 22:27:37.277045 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:37.276667 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr"
Apr 16 22:27:37.285219 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:37.285196 2572 scope.go:117] "RemoveContainer" containerID="52eb4c04d849afce4cebc150f9e942f71026685127945c63ed91d05eff8b2a33"
Apr 16 22:27:37.297566 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:37.297538 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr"]
Apr 16 22:27:37.301962 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:37.301935 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-58fcd669f9q4ntr"]
Apr 16 22:27:38.444727 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:38.444691 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f17a5d06-a261-40cc-91d3-ff0ed3de4cde" path="/var/lib/kubelet/pods/f17a5d06-a261-40cc-91d3-ff0ed3de4cde/volumes"
Apr 16 22:27:38.973635 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:38.973596 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"]
Apr 16 22:27:38.974220 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:38.974200 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f17a5d06-a261-40cc-91d3-ff0ed3de4cde" containerName="main"
Apr 16 22:27:38.974278 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:38.974224 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f17a5d06-a261-40cc-91d3-ff0ed3de4cde" containerName="main"
Apr 16 22:27:38.974278 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:38.974266 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f17a5d06-a261-40cc-91d3-ff0ed3de4cde" containerName="storage-initializer"
Apr 16 22:27:38.974278 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:38.974276 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f17a5d06-a261-40cc-91d3-ff0ed3de4cde" containerName="storage-initializer"
Apr 16 22:27:38.974408 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:38.974395 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f17a5d06-a261-40cc-91d3-ff0ed3de4cde" containerName="main"
Apr 16 22:27:39.018741 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.018697 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"]
Apr 16 22:27:39.018899 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.018865 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.021575 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.021544 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\""
Apr 16 22:27:39.070172 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.070127 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.070365 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.070176 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq95m\" (UniqueName: \"kubernetes.io/projected/f6c33c4b-eb25-427e-9928-1b7e11ec8960-kube-api-access-xq95m\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.070365 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.070214 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6c33c4b-eb25-427e-9928-1b7e11ec8960-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.070365 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.070264 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.070365 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.070305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.070365 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.070326 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.170748 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.170708 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.170748 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.170761 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.171009 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.170779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.171009 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.170836 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.171009 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.170856 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xq95m\" (UniqueName: \"kubernetes.io/projected/f6c33c4b-eb25-427e-9928-1b7e11ec8960-kube-api-access-xq95m\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.171009 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.170878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName:
\"kubernetes.io/secret/f6c33c4b-eb25-427e-9928-1b7e11ec8960-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.171285 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.171259 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.171285 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.171275 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.171400 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.171358 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.173143 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.173119 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.173472 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.173454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6c33c4b-eb25-427e-9928-1b7e11ec8960-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.180660 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.180628 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq95m\" (UniqueName: \"kubernetes.io/projected/f6c33c4b-eb25-427e-9928-1b7e11ec8960-kube-api-access-xq95m\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.330235 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.330193 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:27:39.459712 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:39.459675 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"]
Apr 16 22:27:39.463479 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:27:39.463445 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c33c4b_eb25_427e_9928_1b7e11ec8960.slice/crio-3e1506a88d0def5fc41e866d14a981341d459db1b0c24190a9245279474eb8cd WatchSource:0}: Error finding container 3e1506a88d0def5fc41e866d14a981341d459db1b0c24190a9245279474eb8cd: Status 404 returned error can't find the container with id 3e1506a88d0def5fc41e866d14a981341d459db1b0c24190a9245279474eb8cd
Apr 16 22:27:40.290924 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:40.290883 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" event={"ID":"f6c33c4b-eb25-427e-9928-1b7e11ec8960","Type":"ContainerStarted","Data":"dd253587c736978fbcb5c9df91c376d1fda33c5bccfbfe83be888eef5b7fa1e9"}
Apr 16 22:27:40.290924 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:40.290923 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" event={"ID":"f6c33c4b-eb25-427e-9928-1b7e11ec8960","Type":"ContainerStarted","Data":"3e1506a88d0def5fc41e866d14a981341d459db1b0c24190a9245279474eb8cd"}
Apr 16 22:27:44.305787 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:27:44.305750 2572 generic.go:358] "Generic (PLEG): container finished" podID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerID="dd253587c736978fbcb5c9df91c376d1fda33c5bccfbfe83be888eef5b7fa1e9" exitCode=0
Apr 16 22:27:44.306214 ip-10-0-133-16 kubenswrapper[2572]: I0416
22:27:44.305822 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" event={"ID":"f6c33c4b-eb25-427e-9928-1b7e11ec8960","Type":"ContainerDied","Data":"dd253587c736978fbcb5c9df91c376d1fda33c5bccfbfe83be888eef5b7fa1e9"}
Apr 16 22:28:02.944063 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:02.944021 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7"]
Apr 16 22:28:02.944681 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:02.944407 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" podUID="b94d77a3-f092-49cc-9e01-8f124c269922" containerName="main" containerID="cri-o://5796dc0c5662ecf6549d04f92dd6f12bbafd0b772f48dd9ce3ec64634d3c727b" gracePeriod=30
Apr 16 22:28:03.394345 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.394307 2572 generic.go:358] "Generic (PLEG): container finished" podID="b94d77a3-f092-49cc-9e01-8f124c269922" containerID="5796dc0c5662ecf6549d04f92dd6f12bbafd0b772f48dd9ce3ec64634d3c727b" exitCode=0
Apr 16 22:28:03.394521 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.394370 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" event={"ID":"b94d77a3-f092-49cc-9e01-8f124c269922","Type":"ContainerDied","Data":"5796dc0c5662ecf6549d04f92dd6f12bbafd0b772f48dd9ce3ec64634d3c727b"}
Apr 16 22:28:03.417333 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.417303 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7"
Apr 16 22:28:03.501960 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.501924 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-dshm\") pod \"b94d77a3-f092-49cc-9e01-8f124c269922\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") "
Apr 16 22:28:03.502177 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.501991 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-home\") pod \"b94d77a3-f092-49cc-9e01-8f124c269922\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") "
Apr 16 22:28:03.502177 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.502045 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4n57\" (UniqueName: \"kubernetes.io/projected/b94d77a3-f092-49cc-9e01-8f124c269922-kube-api-access-f4n57\") pod \"b94d77a3-f092-49cc-9e01-8f124c269922\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") "
Apr 16 22:28:03.502177 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.502080 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b94d77a3-f092-49cc-9e01-8f124c269922-tls-certs\") pod \"b94d77a3-f092-49cc-9e01-8f124c269922\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") "
Apr 16 22:28:03.502177 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.502147 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-model-cache\") pod \"b94d77a3-f092-49cc-9e01-8f124c269922\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") "
Apr 16 22:28:03.502399 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.502197 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-kserve-provision-location\") pod \"b94d77a3-f092-49cc-9e01-8f124c269922\" (UID: \"b94d77a3-f092-49cc-9e01-8f124c269922\") "
Apr 16 22:28:03.502399 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.502269 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-home" (OuterVolumeSpecName: "home") pod "b94d77a3-f092-49cc-9e01-8f124c269922" (UID: "b94d77a3-f092-49cc-9e01-8f124c269922"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:28:03.502504 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.502439 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-model-cache" (OuterVolumeSpecName: "model-cache") pod "b94d77a3-f092-49cc-9e01-8f124c269922" (UID: "b94d77a3-f092-49cc-9e01-8f124c269922"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:28:03.502555 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.502507 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-home\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:28:03.504279 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.504211 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-dshm" (OuterVolumeSpecName: "dshm") pod "b94d77a3-f092-49cc-9e01-8f124c269922" (UID: "b94d77a3-f092-49cc-9e01-8f124c269922"). InnerVolumeSpecName "dshm".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:28:03.504279 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.504214 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b94d77a3-f092-49cc-9e01-8f124c269922-kube-api-access-f4n57" (OuterVolumeSpecName: "kube-api-access-f4n57") pod "b94d77a3-f092-49cc-9e01-8f124c269922" (UID: "b94d77a3-f092-49cc-9e01-8f124c269922"). InnerVolumeSpecName "kube-api-access-f4n57". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:28:03.504409 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.504278 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b94d77a3-f092-49cc-9e01-8f124c269922-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b94d77a3-f092-49cc-9e01-8f124c269922" (UID: "b94d77a3-f092-49cc-9e01-8f124c269922"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:28:03.562264 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.562213 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b94d77a3-f092-49cc-9e01-8f124c269922" (UID: "b94d77a3-f092-49cc-9e01-8f124c269922"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:28:03.603960 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.603911 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-model-cache\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:28:03.603960 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.603949 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-kserve-provision-location\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:28:03.603960 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.603966 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b94d77a3-f092-49cc-9e01-8f124c269922-dshm\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:28:03.604285 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.603976 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f4n57\" (UniqueName: \"kubernetes.io/projected/b94d77a3-f092-49cc-9e01-8f124c269922-kube-api-access-f4n57\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:28:03.604285 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:03.603986 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b94d77a3-f092-49cc-9e01-8f124c269922-tls-certs\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:28:04.399462 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:04.399434 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7"
Apr 16 22:28:04.399893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:04.399433 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7" event={"ID":"b94d77a3-f092-49cc-9e01-8f124c269922","Type":"ContainerDied","Data":"eeb68626e85632b01d53d55a921f113056fdf45380f85c38b67be9a263e22981"}
Apr 16 22:28:04.399893 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:04.399552 2572 scope.go:117] "RemoveContainer" containerID="5796dc0c5662ecf6549d04f92dd6f12bbafd0b772f48dd9ce3ec64634d3c727b"
Apr 16 22:28:04.410099 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:04.410071 2572 scope.go:117] "RemoveContainer" containerID="1d817b4329e70ab097214059fbc0be080b60414004c20ea2fc5301d9919bb3a1"
Apr 16 22:28:04.423943 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:04.423905 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7"]
Apr 16 22:28:04.428427 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:04.428393 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-5fcb55f6dc-qvbl7"]
Apr 16 22:28:04.446079 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:04.446038 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b94d77a3-f092-49cc-9e01-8f124c269922" path="/var/lib/kubelet/pods/b94d77a3-f092-49cc-9e01-8f124c269922/volumes"
Apr 16 22:28:08.895156 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:08.895118 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8"]
Apr 16 22:28:08.895545 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:08.895529 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b94d77a3-f092-49cc-9e01-8f124c269922" containerName="storage-initializer"
Apr 16 22:28:08.895587 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:08.895548 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94d77a3-f092-49cc-9e01-8f124c269922" containerName="storage-initializer" Apr 16 22:28:08.895587 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:08.895565 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b94d77a3-f092-49cc-9e01-8f124c269922" containerName="main" Apr 16 22:28:08.895587 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:08.895571 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b94d77a3-f092-49cc-9e01-8f124c269922" containerName="main" Apr 16 22:28:08.895679 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:08.895634 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b94d77a3-f092-49cc-9e01-8f124c269922" containerName="main" Apr 16 22:28:08.906450 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:08.906415 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:08.906608 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:08.906515 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8"] Apr 16 22:28:08.909549 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:08.909522 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 22:28:09.061368 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.061324 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-home\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 
22:28:09.061574 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.061386 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.061574 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.061409 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8gdl\" (UniqueName: \"kubernetes.io/projected/10be46f4-8a63-46fe-980d-2217bea7e537-kube-api-access-s8gdl\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.061574 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.061462 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10be46f4-8a63-46fe-980d-2217bea7e537-tls-certs\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.061574 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.061502 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-dshm\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.061574 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.061548 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-model-cache\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.162611 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.162522 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8gdl\" (UniqueName: \"kubernetes.io/projected/10be46f4-8a63-46fe-980d-2217bea7e537-kube-api-access-s8gdl\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.162611 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.162571 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10be46f4-8a63-46fe-980d-2217bea7e537-tls-certs\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.162611 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.162608 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-dshm\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.162898 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.162652 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-model-cache\") pod 
\"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.162898 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.162674 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-home\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.162898 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.162711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.163161 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.163141 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-model-cache\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.163227 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.163160 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 
22:28:09.163280 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.163231 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-home\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.165048 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.165018 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-dshm\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.165248 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.165231 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10be46f4-8a63-46fe-980d-2217bea7e537-tls-certs\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.171228 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.171207 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8gdl\" (UniqueName: \"kubernetes.io/projected/10be46f4-8a63-46fe-980d-2217bea7e537-kube-api-access-s8gdl\") pod \"precise-prefix-cache-test-kserve-97cdbff9-bpgr8\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.218167 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.218088 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:09.359083 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.359054 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8"] Apr 16 22:28:09.360942 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:28:09.360907 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10be46f4_8a63_46fe_980d_2217bea7e537.slice/crio-acf562d904ec7edce46a01ba8fea1f23fc262f6ae79583f405549c18a6317f78 WatchSource:0}: Error finding container acf562d904ec7edce46a01ba8fea1f23fc262f6ae79583f405549c18a6317f78: Status 404 returned error can't find the container with id acf562d904ec7edce46a01ba8fea1f23fc262f6ae79583f405549c18a6317f78 Apr 16 22:28:09.419842 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:09.419801 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" event={"ID":"10be46f4-8a63-46fe-980d-2217bea7e537","Type":"ContainerStarted","Data":"acf562d904ec7edce46a01ba8fea1f23fc262f6ae79583f405549c18a6317f78"} Apr 16 22:28:10.425869 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:10.425811 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" event={"ID":"10be46f4-8a63-46fe-980d-2217bea7e537","Type":"ContainerStarted","Data":"53e9099ac16e4d213bf451fa4cf074c02473aec8857ba4317fed485a8b0e9cef"} Apr 16 22:28:14.442610 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:14.442520 2572 generic.go:358] "Generic (PLEG): container finished" podID="10be46f4-8a63-46fe-980d-2217bea7e537" containerID="53e9099ac16e4d213bf451fa4cf074c02473aec8857ba4317fed485a8b0e9cef" exitCode=0 Apr 16 22:28:14.444121 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:14.444078 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" event={"ID":"10be46f4-8a63-46fe-980d-2217bea7e537","Type":"ContainerDied","Data":"53e9099ac16e4d213bf451fa4cf074c02473aec8857ba4317fed485a8b0e9cef"} Apr 16 22:28:15.447572 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:15.447536 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" event={"ID":"10be46f4-8a63-46fe-980d-2217bea7e537","Type":"ContainerStarted","Data":"140eb67d65b24c2afa5c6b45c3e61de5b702f86a441360c9b35ab0e01956965b"} Apr 16 22:28:15.467039 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:15.466984 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" podStartSLOduration=7.466966517 podStartE2EDuration="7.466966517s" podCreationTimestamp="2026-04-16 22:28:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:28:15.465017723 +0000 UTC m=+875.690184104" watchObservedRunningTime="2026-04-16 22:28:15.466966517 +0000 UTC m=+875.692132896" Apr 16 22:28:19.219070 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:19.218975 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:19.219070 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:19.219021 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:19.233274 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:19.233241 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:19.476087 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:19.476007 2572 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:29.505389 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:29.505297 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" event={"ID":"f6c33c4b-eb25-427e-9928-1b7e11ec8960","Type":"ContainerStarted","Data":"ad6ea101953be7b9e263fb15ade5c72f1093bc222a8664c1dc3eb403505b02fc"} Apr 16 22:28:29.526657 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:29.526593 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" podStartSLOduration=6.6064112139999995 podStartE2EDuration="51.526574016s" podCreationTimestamp="2026-04-16 22:27:38 +0000 UTC" firstStartedPulling="2026-04-16 22:27:44.30692506 +0000 UTC m=+844.532091419" lastFinishedPulling="2026-04-16 22:28:29.227087864 +0000 UTC m=+889.452254221" observedRunningTime="2026-04-16 22:28:29.523443439 +0000 UTC m=+889.748609819" watchObservedRunningTime="2026-04-16 22:28:29.526574016 +0000 UTC m=+889.751740395" Apr 16 22:28:39.330890 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:39.330851 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" Apr 16 22:28:39.330890 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:39.330901 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" Apr 16 22:28:39.332448 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:39.332413 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.33:8000/health\": dial tcp 10.134.0.33:8000: connect: connection refused" Apr 16 22:28:40.394572 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:40.394546 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/0.log" Apr 16 22:28:40.394984 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:40.394648 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/0.log" Apr 16 22:28:42.601137 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:42.601080 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8"] Apr 16 22:28:42.601668 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:42.601389 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" podUID="10be46f4-8a63-46fe-980d-2217bea7e537" containerName="main" containerID="cri-o://140eb67d65b24c2afa5c6b45c3e61de5b702f86a441360c9b35ab0e01956965b" gracePeriod=30 Apr 16 22:28:42.869697 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:42.869670 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:42.985486 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:42.985452 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-kserve-provision-location\") pod \"10be46f4-8a63-46fe-980d-2217bea7e537\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " Apr 16 22:28:42.985678 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:42.985504 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-home\") pod \"10be46f4-8a63-46fe-980d-2217bea7e537\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " Apr 16 22:28:42.985678 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:42.985544 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-dshm\") pod \"10be46f4-8a63-46fe-980d-2217bea7e537\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " Apr 16 22:28:42.985678 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:42.985596 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10be46f4-8a63-46fe-980d-2217bea7e537-tls-certs\") pod \"10be46f4-8a63-46fe-980d-2217bea7e537\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " Apr 16 22:28:42.985678 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:42.985642 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-model-cache\") pod \"10be46f4-8a63-46fe-980d-2217bea7e537\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " Apr 16 22:28:42.985905 ip-10-0-133-16 kubenswrapper[2572]: I0416 
22:28:42.985709 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8gdl\" (UniqueName: \"kubernetes.io/projected/10be46f4-8a63-46fe-980d-2217bea7e537-kube-api-access-s8gdl\") pod \"10be46f4-8a63-46fe-980d-2217bea7e537\" (UID: \"10be46f4-8a63-46fe-980d-2217bea7e537\") " Apr 16 22:28:42.985905 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:42.985814 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-home" (OuterVolumeSpecName: "home") pod "10be46f4-8a63-46fe-980d-2217bea7e537" (UID: "10be46f4-8a63-46fe-980d-2217bea7e537"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:42.986028 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:42.985998 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-model-cache" (OuterVolumeSpecName: "model-cache") pod "10be46f4-8a63-46fe-980d-2217bea7e537" (UID: "10be46f4-8a63-46fe-980d-2217bea7e537"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:42.986358 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:42.986334 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-home\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:28:42.986358 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:42.986359 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-model-cache\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:28:42.987944 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:42.987901 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10be46f4-8a63-46fe-980d-2217bea7e537-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "10be46f4-8a63-46fe-980d-2217bea7e537" (UID: "10be46f4-8a63-46fe-980d-2217bea7e537"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:28:42.988198 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:42.988170 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10be46f4-8a63-46fe-980d-2217bea7e537-kube-api-access-s8gdl" (OuterVolumeSpecName: "kube-api-access-s8gdl") pod "10be46f4-8a63-46fe-980d-2217bea7e537" (UID: "10be46f4-8a63-46fe-980d-2217bea7e537"). InnerVolumeSpecName "kube-api-access-s8gdl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:28:42.988292 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:42.988215 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-dshm" (OuterVolumeSpecName: "dshm") pod "10be46f4-8a63-46fe-980d-2217bea7e537" (UID: "10be46f4-8a63-46fe-980d-2217bea7e537"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:43.041129 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.041068 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "10be46f4-8a63-46fe-980d-2217bea7e537" (UID: "10be46f4-8a63-46fe-980d-2217bea7e537"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:43.087370 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.087327 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-dshm\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:28:43.087469 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.087374 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/10be46f4-8a63-46fe-980d-2217bea7e537-tls-certs\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:28:43.087469 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.087391 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s8gdl\" (UniqueName: \"kubernetes.io/projected/10be46f4-8a63-46fe-980d-2217bea7e537-kube-api-access-s8gdl\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:28:43.087469 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.087407 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10be46f4-8a63-46fe-980d-2217bea7e537-kserve-provision-location\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:28:43.555908 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.555867 2572 generic.go:358] "Generic (PLEG): container finished" podID="10be46f4-8a63-46fe-980d-2217bea7e537" 
containerID="140eb67d65b24c2afa5c6b45c3e61de5b702f86a441360c9b35ab0e01956965b" exitCode=0 Apr 16 22:28:43.556086 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.555933 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" Apr 16 22:28:43.556086 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.555955 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" event={"ID":"10be46f4-8a63-46fe-980d-2217bea7e537","Type":"ContainerDied","Data":"140eb67d65b24c2afa5c6b45c3e61de5b702f86a441360c9b35ab0e01956965b"} Apr 16 22:28:43.556086 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.555999 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8" event={"ID":"10be46f4-8a63-46fe-980d-2217bea7e537","Type":"ContainerDied","Data":"acf562d904ec7edce46a01ba8fea1f23fc262f6ae79583f405549c18a6317f78"} Apr 16 22:28:43.556086 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.556015 2572 scope.go:117] "RemoveContainer" containerID="140eb67d65b24c2afa5c6b45c3e61de5b702f86a441360c9b35ab0e01956965b" Apr 16 22:28:43.566314 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.566289 2572 scope.go:117] "RemoveContainer" containerID="53e9099ac16e4d213bf451fa4cf074c02473aec8857ba4317fed485a8b0e9cef" Apr 16 22:28:43.580514 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.580475 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8"] Apr 16 22:28:43.582207 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.582163 2572 scope.go:117] "RemoveContainer" containerID="140eb67d65b24c2afa5c6b45c3e61de5b702f86a441360c9b35ab0e01956965b" Apr 16 22:28:43.582772 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:28:43.582622 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"140eb67d65b24c2afa5c6b45c3e61de5b702f86a441360c9b35ab0e01956965b\": container with ID starting with 140eb67d65b24c2afa5c6b45c3e61de5b702f86a441360c9b35ab0e01956965b not found: ID does not exist" containerID="140eb67d65b24c2afa5c6b45c3e61de5b702f86a441360c9b35ab0e01956965b" Apr 16 22:28:43.582772 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.582662 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"140eb67d65b24c2afa5c6b45c3e61de5b702f86a441360c9b35ab0e01956965b"} err="failed to get container status \"140eb67d65b24c2afa5c6b45c3e61de5b702f86a441360c9b35ab0e01956965b\": rpc error: code = NotFound desc = could not find container \"140eb67d65b24c2afa5c6b45c3e61de5b702f86a441360c9b35ab0e01956965b\": container with ID starting with 140eb67d65b24c2afa5c6b45c3e61de5b702f86a441360c9b35ab0e01956965b not found: ID does not exist" Apr 16 22:28:43.582772 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.582687 2572 scope.go:117] "RemoveContainer" containerID="53e9099ac16e4d213bf451fa4cf074c02473aec8857ba4317fed485a8b0e9cef" Apr 16 22:28:43.583051 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:28:43.583020 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53e9099ac16e4d213bf451fa4cf074c02473aec8857ba4317fed485a8b0e9cef\": container with ID starting with 53e9099ac16e4d213bf451fa4cf074c02473aec8857ba4317fed485a8b0e9cef not found: ID does not exist" containerID="53e9099ac16e4d213bf451fa4cf074c02473aec8857ba4317fed485a8b0e9cef" Apr 16 22:28:43.583247 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.583060 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53e9099ac16e4d213bf451fa4cf074c02473aec8857ba4317fed485a8b0e9cef"} err="failed to get container status \"53e9099ac16e4d213bf451fa4cf074c02473aec8857ba4317fed485a8b0e9cef\": rpc error: code = 
NotFound desc = could not find container \"53e9099ac16e4d213bf451fa4cf074c02473aec8857ba4317fed485a8b0e9cef\": container with ID starting with 53e9099ac16e4d213bf451fa4cf074c02473aec8857ba4317fed485a8b0e9cef not found: ID does not exist" Apr 16 22:28:43.585214 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:43.585190 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-bpgr8"] Apr 16 22:28:44.444599 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:44.444556 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10be46f4-8a63-46fe-980d-2217bea7e537" path="/var/lib/kubelet/pods/10be46f4-8a63-46fe-980d-2217bea7e537/volumes" Apr 16 22:28:49.331638 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:49.331595 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="main" probeResult="failure" output="Get \"https://10.134.0.33:8000/health\": dial tcp 10.134.0.33:8000: connect: connection refused" Apr 16 22:28:54.882684 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:54.882647 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg"] Apr 16 22:28:54.883256 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:54.883237 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10be46f4-8a63-46fe-980d-2217bea7e537" containerName="main" Apr 16 22:28:54.883301 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:54.883260 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="10be46f4-8a63-46fe-980d-2217bea7e537" containerName="main" Apr 16 22:28:54.883301 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:54.883279 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10be46f4-8a63-46fe-980d-2217bea7e537" 
containerName="storage-initializer" Apr 16 22:28:54.883301 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:54.883288 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="10be46f4-8a63-46fe-980d-2217bea7e537" containerName="storage-initializer" Apr 16 22:28:54.883418 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:54.883389 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="10be46f4-8a63-46fe-980d-2217bea7e537" containerName="main" Apr 16 22:28:54.886870 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:54.886850 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:54.889998 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:54.889963 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 22:28:54.896841 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:54.896809 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg"] Apr 16 22:28:54.902046 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:54.902012 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv6lx\" (UniqueName: \"kubernetes.io/projected/622aa5dd-a4e0-4b4e-b249-713da19c6058-kube-api-access-lv6lx\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:54.902201 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:54.902061 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-home\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:54.902201 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:54.902192 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-model-cache\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:54.902325 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:54.902225 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-dshm\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:54.902325 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:54.902299 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/622aa5dd-a4e0-4b4e-b249-713da19c6058-tls-certs\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:54.902438 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:54.902336 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-kserve-provision-location\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:55.003342 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.003302 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lv6lx\" (UniqueName: \"kubernetes.io/projected/622aa5dd-a4e0-4b4e-b249-713da19c6058-kube-api-access-lv6lx\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:55.003342 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.003342 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-home\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:55.003608 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.003402 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-model-cache\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:55.003608 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.003420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-dshm\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:55.003608 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.003547 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/622aa5dd-a4e0-4b4e-b249-713da19c6058-tls-certs\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: 
\"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:55.003608 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.003599 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-kserve-provision-location\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:55.003844 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.003822 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-home\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:55.003905 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.003854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-model-cache\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:55.003959 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.003915 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-kserve-provision-location\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:55.005883 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.005854 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-dshm\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:55.006029 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.006010 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/622aa5dd-a4e0-4b4e-b249-713da19c6058-tls-certs\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:55.012224 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.012190 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv6lx\" (UniqueName: \"kubernetes.io/projected/622aa5dd-a4e0-4b4e-b249-713da19c6058-kube-api-access-lv6lx\") pod \"stop-feature-test-kserve-6b98876994-fdvmg\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:55.199797 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.199697 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:28:55.330562 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.330531 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg"] Apr 16 22:28:55.332945 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:28:55.332919 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod622aa5dd_a4e0_4b4e_b249_713da19c6058.slice/crio-8f95488ab39e2e3cd08967685deb94494a4c800bb54b868e1a59ebc02f9b52cb WatchSource:0}: Error finding container 8f95488ab39e2e3cd08967685deb94494a4c800bb54b868e1a59ebc02f9b52cb: Status 404 returned error can't find the container with id 8f95488ab39e2e3cd08967685deb94494a4c800bb54b868e1a59ebc02f9b52cb Apr 16 22:28:55.601269 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.601225 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" event={"ID":"622aa5dd-a4e0-4b4e-b249-713da19c6058","Type":"ContainerStarted","Data":"1ab82f6f935079de16c37c35fd7579f1ec856c81f422beae0a6d8da98809eed4"} Apr 16 22:28:55.601269 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:55.601274 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" event={"ID":"622aa5dd-a4e0-4b4e-b249-713da19c6058","Type":"ContainerStarted","Data":"8f95488ab39e2e3cd08967685deb94494a4c800bb54b868e1a59ebc02f9b52cb"} Apr 16 22:28:59.331432 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:59.331373 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="main" probeResult="failure" output="Get \"https://10.134.0.33:8000/health\": dial tcp 10.134.0.33:8000: connect: connection refused" Apr 16 
22:28:59.619646 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:59.619563 2572 generic.go:358] "Generic (PLEG): container finished" podID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerID="1ab82f6f935079de16c37c35fd7579f1ec856c81f422beae0a6d8da98809eed4" exitCode=0 Apr 16 22:28:59.619775 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:28:59.619639 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" event={"ID":"622aa5dd-a4e0-4b4e-b249-713da19c6058","Type":"ContainerDied","Data":"1ab82f6f935079de16c37c35fd7579f1ec856c81f422beae0a6d8da98809eed4"} Apr 16 22:29:00.626286 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:00.626246 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" event={"ID":"622aa5dd-a4e0-4b4e-b249-713da19c6058","Type":"ContainerStarted","Data":"4041e9b3a97e259e2ed3961312e4c2b043bef7d3849af9fb74df8801390e2d6e"} Apr 16 22:29:00.650006 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:00.649936 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" podStartSLOduration=6.6499161 podStartE2EDuration="6.6499161s" podCreationTimestamp="2026-04-16 22:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:29:00.648009016 +0000 UTC m=+920.873175417" watchObservedRunningTime="2026-04-16 22:29:00.6499161 +0000 UTC m=+920.875082480" Apr 16 22:29:05.200783 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:05.200741 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:29:05.201186 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:05.200801 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" Apr 16 22:29:05.202631 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:05.202602 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 22:29:09.331255 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:09.331204 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="main" probeResult="failure" output="Get \"https://10.134.0.33:8000/health\": dial tcp 10.134.0.33:8000: connect: connection refused" Apr 16 22:29:15.200741 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:15.200686 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 22:29:19.331535 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:19.331493 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="main" probeResult="failure" output="Get \"https://10.134.0.33:8000/health\": dial tcp 10.134.0.33:8000: connect: connection refused" Apr 16 22:29:25.200714 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:25.200670 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="main" probeResult="failure" 
output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 22:29:29.330923 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:29.330860 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="main" probeResult="failure" output="Get \"https://10.134.0.33:8000/health\": dial tcp 10.134.0.33:8000: connect: connection refused" Apr 16 22:29:35.200815 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:35.200766 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 22:29:39.331472 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:39.331429 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="main" probeResult="failure" output="Get \"https://10.134.0.33:8000/health\": dial tcp 10.134.0.33:8000: connect: connection refused" Apr 16 22:29:45.200783 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:45.200732 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 22:29:49.330986 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:49.330900 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" 
podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="main" probeResult="failure" output="Get \"https://10.134.0.33:8000/health\": dial tcp 10.134.0.33:8000: connect: connection refused" Apr 16 22:29:55.200641 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:55.200595 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 22:29:59.331177 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:29:59.331122 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="main" probeResult="failure" output="Get \"https://10.134.0.33:8000/health\": dial tcp 10.134.0.33:8000: connect: connection refused" Apr 16 22:30:05.200832 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:05.200779 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 22:30:09.331272 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:09.331224 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="main" probeResult="failure" output="Get \"https://10.134.0.33:8000/health\": dial tcp 10.134.0.33:8000: connect: connection refused" Apr 16 22:30:15.201086 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:15.201028 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 22:30:19.341021 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:19.340976 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" Apr 16 22:30:19.348630 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:19.348608 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" Apr 16 22:30:25.200966 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:25.200920 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 22:30:28.821913 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:28.821874 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"] Apr 16 22:30:28.822415 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:28.822252 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="main" containerID="cri-o://ad6ea101953be7b9e263fb15ade5c72f1093bc222a8664c1dc3eb403505b02fc" gracePeriod=30 Apr 16 22:30:35.200615 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:35.200559 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" 
podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 22:30:37.288286 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.288247 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6"] Apr 16 22:30:37.292931 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.292911 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.296761 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.296731 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 16 22:30:37.312245 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.312211 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6"] Apr 16 22:30:37.387618 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.387568 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-dshm\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.387785 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.387653 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-model-cache\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.387785 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.387709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvvz2\" (UniqueName: \"kubernetes.io/projected/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-kube-api-access-vvvz2\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.387785 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.387767 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-tls-certs\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.387892 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.387837 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.387892 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.387875 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-home\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 
22:30:37.489100 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.489058 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.489250 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.489130 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-home\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.489250 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.489165 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-dshm\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.489250 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.489214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-model-cache\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.489377 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.489253 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvvz2\" 
(UniqueName: \"kubernetes.io/projected/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-kube-api-access-vvvz2\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.489377 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.489291 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-tls-certs\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.489618 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.489590 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-home\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.489690 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.489622 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-model-cache\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.489690 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.489653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.491665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.491637 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-dshm\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.491859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.491838 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-tls-certs\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.510306 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.510274 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvvz2\" (UniqueName: \"kubernetes.io/projected/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-kube-api-access-vvvz2\") pod \"custom-route-timeout-test-kserve-65bd757bc8-w8mp6\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:30:37.603963 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.603869 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6"
Apr 16 22:30:37.748451 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.748406 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6"]
Apr 16 22:30:37.750970 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:30:37.750931 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73e7bc45_9f0e_4097_842f_8ac6f143dbb3.slice/crio-a4dafc451d0064368d58dd4849ab94d9e3a95e64a9102740f7a42a8c96a55dbb WatchSource:0}: Error finding container a4dafc451d0064368d58dd4849ab94d9e3a95e64a9102740f7a42a8c96a55dbb: Status 404 returned error can't find the container with id a4dafc451d0064368d58dd4849ab94d9e3a95e64a9102740f7a42a8c96a55dbb
Apr 16 22:30:37.986368 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.986263 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" event={"ID":"73e7bc45-9f0e-4097-842f-8ac6f143dbb3","Type":"ContainerStarted","Data":"aba874b4386f67ff90136e21f2b2e31ca808e81bbcf03fa423772ad1f337320b"}
Apr 16 22:30:37.986368 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:37.986319 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" event={"ID":"73e7bc45-9f0e-4097-842f-8ac6f143dbb3","Type":"ContainerStarted","Data":"a4dafc451d0064368d58dd4849ab94d9e3a95e64a9102740f7a42a8c96a55dbb"}
Apr 16 22:30:42.001257 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:42.001216 2572 generic.go:358] "Generic (PLEG): container finished" podID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerID="aba874b4386f67ff90136e21f2b2e31ca808e81bbcf03fa423772ad1f337320b" exitCode=0
Apr 16 22:30:42.001667 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:42.001291 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" event={"ID":"73e7bc45-9f0e-4097-842f-8ac6f143dbb3","Type":"ContainerDied","Data":"aba874b4386f67ff90136e21f2b2e31ca808e81bbcf03fa423772ad1f337320b"}
Apr 16 22:30:43.006575 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:43.006533 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" event={"ID":"73e7bc45-9f0e-4097-842f-8ac6f143dbb3","Type":"ContainerStarted","Data":"3a585d818bde26ea12370bcb6786e896664cc85c5b2ecb22808a7c84d7accb19"}
Apr 16 22:30:43.034638 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:43.034560 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" podStartSLOduration=6.034537379 podStartE2EDuration="6.034537379s" podCreationTimestamp="2026-04-16 22:30:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:30:43.027425084 +0000 UTC m=+1023.252591464" watchObservedRunningTime="2026-04-16 22:30:43.034537379 +0000 UTC m=+1023.259703759"
Apr 16 22:30:45.201043 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:45.200991 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused"
Apr 16 22:30:47.604785 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:47.604725 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6"
Apr 16 22:30:47.604785 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:47.604781 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6"
Apr 16 22:30:47.606216 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:47.606178 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 16 22:30:55.211285 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:55.211244 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg"
Apr 16 22:30:55.219707 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:55.219669 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg"
Apr 16 22:30:57.604802 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:57.604742 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 16 22:30:58.947639 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:58.947603 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg"]
Apr 16 22:30:58.948093 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:58.947857 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="main" containerID="cri-o://4041e9b3a97e259e2ed3961312e4c2b043bef7d3849af9fb74df8801390e2d6e" gracePeriod=30
Apr 16 22:30:59.069403 ip-10-0-133-16 kubenswrapper[2572]: I0416
22:30:59.069370 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt_f6c33c4b-eb25-427e-9928-1b7e11ec8960/main/0.log"
Apr 16 22:30:59.069727 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.069694 2572 generic.go:358] "Generic (PLEG): container finished" podID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerID="ad6ea101953be7b9e263fb15ade5c72f1093bc222a8664c1dc3eb403505b02fc" exitCode=137
Apr 16 22:30:59.069822 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.069754 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" event={"ID":"f6c33c4b-eb25-427e-9928-1b7e11ec8960","Type":"ContainerDied","Data":"ad6ea101953be7b9e263fb15ade5c72f1093bc222a8664c1dc3eb403505b02fc"}
Apr 16 22:30:59.098023 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.097996 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt_f6c33c4b-eb25-427e-9928-1b7e11ec8960/main/0.log"
Apr 16 22:30:59.098412 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.098393 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:30:59.193581 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.193545 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq95m\" (UniqueName: \"kubernetes.io/projected/f6c33c4b-eb25-427e-9928-1b7e11ec8960-kube-api-access-xq95m\") pod \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") "
Apr 16 22:30:59.193581 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.193591 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-kserve-provision-location\") pod \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") "
Apr 16 22:30:59.193841 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.193633 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-home\") pod \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") "
Apr 16 22:30:59.193841 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.193677 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6c33c4b-eb25-427e-9928-1b7e11ec8960-tls-certs\") pod \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") "
Apr 16 22:30:59.193841 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.193729 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-dshm\") pod \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") "
Apr 16 22:30:59.193841 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.193776 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-model-cache\") pod \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\" (UID: \"f6c33c4b-eb25-427e-9928-1b7e11ec8960\") "
Apr 16 22:30:59.194053 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.194010 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-home" (OuterVolumeSpecName: "home") pod "f6c33c4b-eb25-427e-9928-1b7e11ec8960" (UID: "f6c33c4b-eb25-427e-9928-1b7e11ec8960"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:30:59.194295 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.194263 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-model-cache" (OuterVolumeSpecName: "model-cache") pod "f6c33c4b-eb25-427e-9928-1b7e11ec8960" (UID: "f6c33c4b-eb25-427e-9928-1b7e11ec8960"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:30:59.196170 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.196134 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6c33c4b-eb25-427e-9928-1b7e11ec8960-kube-api-access-xq95m" (OuterVolumeSpecName: "kube-api-access-xq95m") pod "f6c33c4b-eb25-427e-9928-1b7e11ec8960" (UID: "f6c33c4b-eb25-427e-9928-1b7e11ec8960"). InnerVolumeSpecName "kube-api-access-xq95m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:30:59.196379 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.196350 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-dshm" (OuterVolumeSpecName: "dshm") pod "f6c33c4b-eb25-427e-9928-1b7e11ec8960" (UID: "f6c33c4b-eb25-427e-9928-1b7e11ec8960"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:30:59.196552 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.196530 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6c33c4b-eb25-427e-9928-1b7e11ec8960-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f6c33c4b-eb25-427e-9928-1b7e11ec8960" (UID: "f6c33c4b-eb25-427e-9928-1b7e11ec8960"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:30:59.249696 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.249631 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f6c33c4b-eb25-427e-9928-1b7e11ec8960" (UID: "f6c33c4b-eb25-427e-9928-1b7e11ec8960"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:30:59.295510 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.295469 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f6c33c4b-eb25-427e-9928-1b7e11ec8960-tls-certs\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:30:59.295510 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.295499 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-dshm\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:30:59.295510 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.295508 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-model-cache\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:30:59.295884 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.295523 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xq95m\" (UniqueName: \"kubernetes.io/projected/f6c33c4b-eb25-427e-9928-1b7e11ec8960-kube-api-access-xq95m\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:30:59.295884 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.295536 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-kserve-provision-location\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:30:59.295884 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:30:59.295545 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f6c33c4b-eb25-427e-9928-1b7e11ec8960-home\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:31:00.075207 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:00.075171 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt_f6c33c4b-eb25-427e-9928-1b7e11ec8960/main/0.log"
Apr 16 22:31:00.075694 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:00.075491 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt" event={"ID":"f6c33c4b-eb25-427e-9928-1b7e11ec8960","Type":"ContainerDied","Data":"3e1506a88d0def5fc41e866d14a981341d459db1b0c24190a9245279474eb8cd"}
Apr 16 22:31:00.075694 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:00.075530 2572 scope.go:117] "RemoveContainer" containerID="ad6ea101953be7b9e263fb15ade5c72f1093bc222a8664c1dc3eb403505b02fc"
Apr 16 22:31:00.075694 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:00.075553 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"
Apr 16 22:31:00.099919 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:00.098956 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"]
Apr 16 22:31:00.099994 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:00.099917 2572 scope.go:117] "RemoveContainer" containerID="dd253587c736978fbcb5c9df91c376d1fda33c5bccfbfe83be888eef5b7fa1e9"
Apr 16 22:31:00.102477 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:00.102452 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-b6c7b4cb-b4vdt"]
Apr 16 22:31:00.444700 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:00.444616 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" path="/var/lib/kubelet/pods/f6c33c4b-eb25-427e-9928-1b7e11ec8960/volumes"
Apr 16 22:31:07.605154 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:07.605074 2572
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 16 22:31:17.604864 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:17.604815 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 16 22:31:27.604897 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:27.604845 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 16 22:31:29.172236 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.172209 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6b98876994-fdvmg_622aa5dd-a4e0-4b4e-b249-713da19c6058/main/0.log"
Apr 16 22:31:29.172598 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.172583 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg"
Apr 16 22:31:29.178408 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.178390 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6b98876994-fdvmg_622aa5dd-a4e0-4b4e-b249-713da19c6058/main/0.log"
Apr 16 22:31:29.178727 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.178705 2572 generic.go:358] "Generic (PLEG): container finished" podID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerID="4041e9b3a97e259e2ed3961312e4c2b043bef7d3849af9fb74df8801390e2d6e" exitCode=137
Apr 16 22:31:29.178817 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.178759 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" event={"ID":"622aa5dd-a4e0-4b4e-b249-713da19c6058","Type":"ContainerDied","Data":"4041e9b3a97e259e2ed3961312e4c2b043bef7d3849af9fb74df8801390e2d6e"}
Apr 16 22:31:29.178817 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.178774 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg"
Apr 16 22:31:29.178817 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.178788 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg" event={"ID":"622aa5dd-a4e0-4b4e-b249-713da19c6058","Type":"ContainerDied","Data":"8f95488ab39e2e3cd08967685deb94494a4c800bb54b868e1a59ebc02f9b52cb"}
Apr 16 22:31:29.178817 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.178805 2572 scope.go:117] "RemoveContainer" containerID="4041e9b3a97e259e2ed3961312e4c2b043bef7d3849af9fb74df8801390e2d6e"
Apr 16 22:31:29.208339 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.207353 2572 scope.go:117] "RemoveContainer" containerID="1ab82f6f935079de16c37c35fd7579f1ec856c81f422beae0a6d8da98809eed4"
Apr 16 22:31:29.275426 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.275400 2572 scope.go:117] "RemoveContainer" containerID="4041e9b3a97e259e2ed3961312e4c2b043bef7d3849af9fb74df8801390e2d6e"
Apr 16 22:31:29.275750 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:31:29.275722 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4041e9b3a97e259e2ed3961312e4c2b043bef7d3849af9fb74df8801390e2d6e\": container with ID starting with 4041e9b3a97e259e2ed3961312e4c2b043bef7d3849af9fb74df8801390e2d6e not found: ID does not exist" containerID="4041e9b3a97e259e2ed3961312e4c2b043bef7d3849af9fb74df8801390e2d6e"
Apr 16 22:31:29.275799 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.275764 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4041e9b3a97e259e2ed3961312e4c2b043bef7d3849af9fb74df8801390e2d6e"} err="failed to get container status \"4041e9b3a97e259e2ed3961312e4c2b043bef7d3849af9fb74df8801390e2d6e\": rpc error: code = NotFound desc = could not find container \"4041e9b3a97e259e2ed3961312e4c2b043bef7d3849af9fb74df8801390e2d6e\": container with ID starting with 4041e9b3a97e259e2ed3961312e4c2b043bef7d3849af9fb74df8801390e2d6e not found: ID does not exist"
Apr 16 22:31:29.275799 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.275791 2572 scope.go:117] "RemoveContainer" containerID="1ab82f6f935079de16c37c35fd7579f1ec856c81f422beae0a6d8da98809eed4"
Apr 16 22:31:29.276084 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:31:29.276067 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ab82f6f935079de16c37c35fd7579f1ec856c81f422beae0a6d8da98809eed4\": container with ID starting with 1ab82f6f935079de16c37c35fd7579f1ec856c81f422beae0a6d8da98809eed4 not found: ID does not exist" containerID="1ab82f6f935079de16c37c35fd7579f1ec856c81f422beae0a6d8da98809eed4"
Apr 16 22:31:29.276194 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.276095 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab82f6f935079de16c37c35fd7579f1ec856c81f422beae0a6d8da98809eed4"} err="failed to get container status \"1ab82f6f935079de16c37c35fd7579f1ec856c81f422beae0a6d8da98809eed4\": rpc error: code = NotFound desc = could not find container \"1ab82f6f935079de16c37c35fd7579f1ec856c81f422beae0a6d8da98809eed4\": container with ID starting with 1ab82f6f935079de16c37c35fd7579f1ec856c81f422beae0a6d8da98809eed4 not found: ID does not exist"
Apr 16 22:31:29.282485 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.282465 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-home\") pod \"622aa5dd-a4e0-4b4e-b249-713da19c6058\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") "
Apr 16 22:31:29.282547 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.282536 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for
volume \"kube-api-access-lv6lx\" (UniqueName: \"kubernetes.io/projected/622aa5dd-a4e0-4b4e-b249-713da19c6058-kube-api-access-lv6lx\") pod \"622aa5dd-a4e0-4b4e-b249-713da19c6058\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") "
Apr 16 22:31:29.282583 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.282571 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-kserve-provision-location\") pod \"622aa5dd-a4e0-4b4e-b249-713da19c6058\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") "
Apr 16 22:31:29.282629 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.282593 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-model-cache\") pod \"622aa5dd-a4e0-4b4e-b249-713da19c6058\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") "
Apr 16 22:31:29.282629 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.282617 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-dshm\") pod \"622aa5dd-a4e0-4b4e-b249-713da19c6058\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") "
Apr 16 22:31:29.282728 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.282700 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/622aa5dd-a4e0-4b4e-b249-713da19c6058-tls-certs\") pod \"622aa5dd-a4e0-4b4e-b249-713da19c6058\" (UID: \"622aa5dd-a4e0-4b4e-b249-713da19c6058\") "
Apr 16 22:31:29.282929 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.282888 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-model-cache" (OuterVolumeSpecName: "model-cache") pod "622aa5dd-a4e0-4b4e-b249-713da19c6058" (UID: "622aa5dd-a4e0-4b4e-b249-713da19c6058"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:31:29.282929 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.282901 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-home" (OuterVolumeSpecName: "home") pod "622aa5dd-a4e0-4b4e-b249-713da19c6058" (UID: "622aa5dd-a4e0-4b4e-b249-713da19c6058"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:31:29.283070 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.282978 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-model-cache\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:31:29.283070 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.282995 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-home\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:31:29.284656 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.284628 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/622aa5dd-a4e0-4b4e-b249-713da19c6058-kube-api-access-lv6lx" (OuterVolumeSpecName: "kube-api-access-lv6lx") pod "622aa5dd-a4e0-4b4e-b249-713da19c6058" (UID: "622aa5dd-a4e0-4b4e-b249-713da19c6058"). InnerVolumeSpecName "kube-api-access-lv6lx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:31:29.284781 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.284759 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/622aa5dd-a4e0-4b4e-b249-713da19c6058-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "622aa5dd-a4e0-4b4e-b249-713da19c6058" (UID: "622aa5dd-a4e0-4b4e-b249-713da19c6058"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:31:29.284944 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.284924 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-dshm" (OuterVolumeSpecName: "dshm") pod "622aa5dd-a4e0-4b4e-b249-713da19c6058" (UID: "622aa5dd-a4e0-4b4e-b249-713da19c6058"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:31:29.335918 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.335866 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "622aa5dd-a4e0-4b4e-b249-713da19c6058" (UID: "622aa5dd-a4e0-4b4e-b249-713da19c6058"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:31:29.384540 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.384501 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-dshm\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:31:29.384540 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.384540 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/622aa5dd-a4e0-4b4e-b249-713da19c6058-tls-certs\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:31:29.384707 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.384559 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lv6lx\" (UniqueName: \"kubernetes.io/projected/622aa5dd-a4e0-4b4e-b249-713da19c6058-kube-api-access-lv6lx\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:31:29.384707 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.384574 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/622aa5dd-a4e0-4b4e-b249-713da19c6058-kserve-provision-location\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:31:29.502544 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.502510 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg"]
Apr 16 22:31:29.505526 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:29.505505 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-fdvmg"]
Apr 16 22:31:30.445490 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:30.445456 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" path="/var/lib/kubelet/pods/622aa5dd-a4e0-4b4e-b249-713da19c6058/volumes"
Apr 16 22:31:32.573566
ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.573526 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"]
Apr 16 22:31:32.573930 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.573884 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="storage-initializer"
Apr 16 22:31:32.573930 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.573895 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="storage-initializer"
Apr 16 22:31:32.573930 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.573905 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="main"
Apr 16 22:31:32.573930 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.573911 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="main"
Apr 16 22:31:32.573930 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.573923 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="main"
Apr 16 22:31:32.573930 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.573933 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="main"
Apr 16 22:31:32.574124 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.573946 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="storage-initializer"
Apr 16 22:31:32.574124 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.573952 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="storage-initializer"
Apr 16 22:31:32.574124 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.574010 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6c33c4b-eb25-427e-9928-1b7e11ec8960" containerName="main"
Apr 16 22:31:32.574124 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.574018 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="622aa5dd-a4e0-4b4e-b249-713da19c6058" containerName="main"
Apr 16 22:31:32.578900 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.578880 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.581645 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.581620 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 16 22:31:32.587541 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.587515 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"]
Apr 16 22:31:32.715691 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.715658 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7157f0e0-543a-4348-95d9-2450299a9c4a-tls-certs\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.715691 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.715693 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ghxz\" (UniqueName: \"kubernetes.io/projected/7157f0e0-543a-4348-95d9-2450299a9c4a-kube-api-access-9ghxz\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.715950 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.715719 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-model-cache\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.715950 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.715782 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-dshm\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.715950 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.715912 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-kserve-provision-location\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.716083 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.715962 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-home\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.817027 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.816990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-home\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.817302 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.817044 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7157f0e0-543a-4348-95d9-2450299a9c4a-tls-certs\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.817302 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.817070 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ghxz\" (UniqueName: \"kubernetes.io/projected/7157f0e0-543a-4348-95d9-2450299a9c4a-kube-api-access-9ghxz\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.817302 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.817124 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-model-cache\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.817302 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.817164 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-dshm\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.817302 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.817261 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-kserve-provision-location\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.817558 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.817443 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-home\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.817558 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.817496 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-model-cache\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.817558 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.817519 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-kserve-provision-location\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.819354 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.819330 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-dshm\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.819635 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.819613 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7157f0e0-543a-4348-95d9-2450299a9c4a-tls-certs\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.825499 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.825452 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ghxz\" (UniqueName: \"kubernetes.io/projected/7157f0e0-543a-4348-95d9-2450299a9c4a-kube-api-access-9ghxz\") pod \"stop-feature-test-kserve-6b98876994-475pl\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:31:32.889406 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:32.889371 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" Apr 16 22:31:33.029898 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:33.029791 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"] Apr 16 22:31:33.032668 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:31:33.032642 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7157f0e0_543a_4348_95d9_2450299a9c4a.slice/crio-d3dced18f71737355f1520607d2953bf5409b2e3a8238b100b5929bb49019246 WatchSource:0}: Error finding container d3dced18f71737355f1520607d2953bf5409b2e3a8238b100b5929bb49019246: Status 404 returned error can't find the container with id d3dced18f71737355f1520607d2953bf5409b2e3a8238b100b5929bb49019246 Apr 16 22:31:33.034714 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:33.034696 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:31:33.194883 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:33.194840 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" event={"ID":"7157f0e0-543a-4348-95d9-2450299a9c4a","Type":"ContainerStarted","Data":"d1c8c46375e3a2b392754d540dda44c16d12a1c7be0d0bd768de9905d59bec66"} Apr 16 22:31:33.194883 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:33.194884 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" event={"ID":"7157f0e0-543a-4348-95d9-2450299a9c4a","Type":"ContainerStarted","Data":"d3dced18f71737355f1520607d2953bf5409b2e3a8238b100b5929bb49019246"} Apr 16 22:31:37.210616 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:37.210582 2572 generic.go:358] "Generic (PLEG): container finished" podID="7157f0e0-543a-4348-95d9-2450299a9c4a" 
containerID="d1c8c46375e3a2b392754d540dda44c16d12a1c7be0d0bd768de9905d59bec66" exitCode=0 Apr 16 22:31:37.210961 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:37.210645 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" event={"ID":"7157f0e0-543a-4348-95d9-2450299a9c4a","Type":"ContainerDied","Data":"d1c8c46375e3a2b392754d540dda44c16d12a1c7be0d0bd768de9905d59bec66"} Apr 16 22:31:37.605366 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:37.605313 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused" Apr 16 22:31:38.215671 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:38.215631 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" event={"ID":"7157f0e0-543a-4348-95d9-2450299a9c4a","Type":"ContainerStarted","Data":"4386a5b493ca62956722ae7763b732e3913cd12002c2c180b45dcdb19383b17e"} Apr 16 22:31:38.237649 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:38.237588 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" podStartSLOduration=6.237569541 podStartE2EDuration="6.237569541s" podCreationTimestamp="2026-04-16 22:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:31:38.235690126 +0000 UTC m=+1078.460856504" watchObservedRunningTime="2026-04-16 22:31:38.237569541 +0000 UTC m=+1078.462735922" Apr 16 22:31:42.890265 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:42.890226 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" Apr 16 22:31:42.890265 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:42.890274 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" Apr 16 22:31:42.891983 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:42.891945 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 22:31:47.604832 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:47.604775 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused" Apr 16 22:31:52.890123 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:52.890066 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 22:31:57.605330 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:31:57.605280 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused" Apr 16 22:32:02.890762 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:02.890711 2572 prober.go:120] 
"Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 22:32:07.604746 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:07.604702 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused" Apr 16 22:32:12.890397 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:12.890350 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 22:32:17.614326 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:17.614287 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:32:17.621884 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:17.621855 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:32:22.889810 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:22.889760 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 22:32:23.690126 ip-10-0-133-16 
kubenswrapper[2572]: I0416 22:32:23.690078 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6"] Apr 16 22:32:23.690598 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:23.690519 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerName="main" containerID="cri-o://3a585d818bde26ea12370bcb6786e896664cc85c5b2ecb22808a7c84d7accb19" gracePeriod=30 Apr 16 22:32:32.890466 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:32.890423 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 22:32:35.676823 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.676788 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th"] Apr 16 22:32:35.680843 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.680818 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.683726 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.683705 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 16 22:32:35.694344 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.694314 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th"] Apr 16 22:32:35.816654 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.816618 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-dshm\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.816814 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.816665 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-home\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.816814 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.816710 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-kserve-provision-location\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.816814 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.816728 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb7bw\" (UniqueName: \"kubernetes.io/projected/4c0128a6-01ea-4f34-af8a-568ad5692849-kube-api-access-kb7bw\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.816814 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.816761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4c0128a6-01ea-4f34-af8a-568ad5692849-tls-certs\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.816948 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.816874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-model-cache\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.917593 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.917553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4c0128a6-01ea-4f34-af8a-568ad5692849-tls-certs\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.917769 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.917616 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-model-cache\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.917769 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.917667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-dshm\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.917769 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.917711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-home\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.917769 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.917749 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-kserve-provision-location\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.917965 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.917772 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb7bw\" (UniqueName: \"kubernetes.io/projected/4c0128a6-01ea-4f34-af8a-568ad5692849-kube-api-access-kb7bw\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.918220 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.918193 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-model-cache\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.918331 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.918217 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-home\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.918331 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.918259 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-kserve-provision-location\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.919952 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.919928 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-dshm\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.920070 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.920052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4c0128a6-01ea-4f34-af8a-568ad5692849-tls-certs\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.926427 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.926403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb7bw\" (UniqueName: \"kubernetes.io/projected/4c0128a6-01ea-4f34-af8a-568ad5692849-kube-api-access-kb7bw\") pod \"router-with-refs-test-kserve-99d4b7f4-dc5th\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:35.994605 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:35.994517 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:36.138360 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:36.138236 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th"] Apr 16 22:32:36.143595 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:32:36.143553 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c0128a6_01ea_4f34_af8a_568ad5692849.slice/crio-b2e4c153a7b37c46a81dc940223067585139ce1f3c1713fb58cb52a093c785fc WatchSource:0}: Error finding container b2e4c153a7b37c46a81dc940223067585139ce1f3c1713fb58cb52a093c785fc: Status 404 returned error can't find the container with id b2e4c153a7b37c46a81dc940223067585139ce1f3c1713fb58cb52a093c785fc Apr 16 22:32:36.436851 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:36.436812 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" 
event={"ID":"4c0128a6-01ea-4f34-af8a-568ad5692849","Type":"ContainerStarted","Data":"3eb5a610023e0af520499356c6ec36021be724e12e8dd0036bccfc65f58be70f"} Apr 16 22:32:36.436851 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:36.436857 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" event={"ID":"4c0128a6-01ea-4f34-af8a-568ad5692849","Type":"ContainerStarted","Data":"b2e4c153a7b37c46a81dc940223067585139ce1f3c1713fb58cb52a093c785fc"} Apr 16 22:32:40.454417 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:40.454385 2572 generic.go:358] "Generic (PLEG): container finished" podID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerID="3eb5a610023e0af520499356c6ec36021be724e12e8dd0036bccfc65f58be70f" exitCode=0 Apr 16 22:32:40.454756 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:40.454463 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" event={"ID":"4c0128a6-01ea-4f34-af8a-568ad5692849","Type":"ContainerDied","Data":"3eb5a610023e0af520499356c6ec36021be724e12e8dd0036bccfc65f58be70f"} Apr 16 22:32:41.463412 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:41.463376 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" event={"ID":"4c0128a6-01ea-4f34-af8a-568ad5692849","Type":"ContainerStarted","Data":"f2efc815644c7473bcb40aac6f7a5d56b1ce5a14d5f68abb8687801af221260e"} Apr 16 22:32:41.484223 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:41.484135 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" podStartSLOduration=6.484094197 podStartE2EDuration="6.484094197s" podCreationTimestamp="2026-04-16 22:32:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:32:41.482210886 +0000 
UTC m=+1141.707377266" watchObservedRunningTime="2026-04-16 22:32:41.484094197 +0000 UTC m=+1141.709260575" Apr 16 22:32:42.890166 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:42.890117 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 22:32:45.994799 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:45.994755 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:45.994799 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:45.994802 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:32:45.996061 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:45.996032 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused" Apr 16 22:32:52.890217 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:52.890100 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 22:32:54.009661 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.009624 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-65bd757bc8-w8mp6_73e7bc45-9f0e-4097-842f-8ac6f143dbb3/main/0.log" Apr 16 22:32:54.010031 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.010017 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" Apr 16 22:32:54.083781 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.083735 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-tls-certs\") pod \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " Apr 16 22:32:54.083973 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.083794 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-dshm\") pod \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " Apr 16 22:32:54.083973 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.083892 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvvz2\" (UniqueName: \"kubernetes.io/projected/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-kube-api-access-vvvz2\") pod \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " Apr 16 22:32:54.083973 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.083954 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-kserve-provision-location\") pod \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " Apr 16 22:32:54.084158 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.084030 2572 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-model-cache\") pod \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " Apr 16 22:32:54.084158 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.084061 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-home\") pod \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\" (UID: \"73e7bc45-9f0e-4097-842f-8ac6f143dbb3\") " Apr 16 22:32:54.084645 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.084618 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-home" (OuterVolumeSpecName: "home") pod "73e7bc45-9f0e-4097-842f-8ac6f143dbb3" (UID: "73e7bc45-9f0e-4097-842f-8ac6f143dbb3"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:32:54.084857 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.084804 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-model-cache" (OuterVolumeSpecName: "model-cache") pod "73e7bc45-9f0e-4097-842f-8ac6f143dbb3" (UID: "73e7bc45-9f0e-4097-842f-8ac6f143dbb3"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:32:54.086373 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.086266 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "73e7bc45-9f0e-4097-842f-8ac6f143dbb3" (UID: "73e7bc45-9f0e-4097-842f-8ac6f143dbb3"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:32:54.086481 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.086400 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-kube-api-access-vvvz2" (OuterVolumeSpecName: "kube-api-access-vvvz2") pod "73e7bc45-9f0e-4097-842f-8ac6f143dbb3" (UID: "73e7bc45-9f0e-4097-842f-8ac6f143dbb3"). InnerVolumeSpecName "kube-api-access-vvvz2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:32:54.086703 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.086672 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-dshm" (OuterVolumeSpecName: "dshm") pod "73e7bc45-9f0e-4097-842f-8ac6f143dbb3" (UID: "73e7bc45-9f0e-4097-842f-8ac6f143dbb3"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:32:54.136952 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.136902 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "73e7bc45-9f0e-4097-842f-8ac6f143dbb3" (UID: "73e7bc45-9f0e-4097-842f-8ac6f143dbb3"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:32:54.185180 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.185145 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-tls-certs\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:32:54.185302 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.185181 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-dshm\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:32:54.185302 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.185209 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vvvz2\" (UniqueName: \"kubernetes.io/projected/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-kube-api-access-vvvz2\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:32:54.185302 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.185223 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-kserve-provision-location\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:32:54.185302 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.185237 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-model-cache\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:32:54.185302 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.185251 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/73e7bc45-9f0e-4097-842f-8ac6f143dbb3-home\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:32:54.511345 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.511319 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-65bd757bc8-w8mp6_73e7bc45-9f0e-4097-842f-8ac6f143dbb3/main/0.log"
Apr 16 22:32:54.511715 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.511691 2572 generic.go:358] "Generic (PLEG): container finished" podID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerID="3a585d818bde26ea12370bcb6786e896664cc85c5b2ecb22808a7c84d7accb19" exitCode=137
Apr 16 22:32:54.511829 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.511759 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6"
Apr 16 22:32:54.511829 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.511774 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" event={"ID":"73e7bc45-9f0e-4097-842f-8ac6f143dbb3","Type":"ContainerDied","Data":"3a585d818bde26ea12370bcb6786e896664cc85c5b2ecb22808a7c84d7accb19"}
Apr 16 22:32:54.511829 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.511825 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6" event={"ID":"73e7bc45-9f0e-4097-842f-8ac6f143dbb3","Type":"ContainerDied","Data":"a4dafc451d0064368d58dd4849ab94d9e3a95e64a9102740f7a42a8c96a55dbb"}
Apr 16 22:32:54.511938 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.511845 2572 scope.go:117] "RemoveContainer" containerID="3a585d818bde26ea12370bcb6786e896664cc85c5b2ecb22808a7c84d7accb19"
Apr 16 22:32:54.530422 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.530401 2572 scope.go:117] "RemoveContainer" containerID="aba874b4386f67ff90136e21f2b2e31ca808e81bbcf03fa423772ad1f337320b"
Apr 16 22:32:54.531798 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.531773 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6"]
Apr 16 22:32:54.534761 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.534740 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-65bd757bc8-w8mp6"]
Apr 16 22:32:54.588769 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.588733 2572 scope.go:117] "RemoveContainer" containerID="3a585d818bde26ea12370bcb6786e896664cc85c5b2ecb22808a7c84d7accb19"
Apr 16 22:32:54.589135 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:32:54.589089 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a585d818bde26ea12370bcb6786e896664cc85c5b2ecb22808a7c84d7accb19\": container with ID starting with 3a585d818bde26ea12370bcb6786e896664cc85c5b2ecb22808a7c84d7accb19 not found: ID does not exist" containerID="3a585d818bde26ea12370bcb6786e896664cc85c5b2ecb22808a7c84d7accb19"
Apr 16 22:32:54.589231 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.589148 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a585d818bde26ea12370bcb6786e896664cc85c5b2ecb22808a7c84d7accb19"} err="failed to get container status \"3a585d818bde26ea12370bcb6786e896664cc85c5b2ecb22808a7c84d7accb19\": rpc error: code = NotFound desc = could not find container \"3a585d818bde26ea12370bcb6786e896664cc85c5b2ecb22808a7c84d7accb19\": container with ID starting with 3a585d818bde26ea12370bcb6786e896664cc85c5b2ecb22808a7c84d7accb19 not found: ID does not exist"
Apr 16 22:32:54.589231 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.589174 2572 scope.go:117] "RemoveContainer" containerID="aba874b4386f67ff90136e21f2b2e31ca808e81bbcf03fa423772ad1f337320b"
Apr 16 22:32:54.589453 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:32:54.589433 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aba874b4386f67ff90136e21f2b2e31ca808e81bbcf03fa423772ad1f337320b\": container with ID starting with aba874b4386f67ff90136e21f2b2e31ca808e81bbcf03fa423772ad1f337320b not found: ID does not exist" containerID="aba874b4386f67ff90136e21f2b2e31ca808e81bbcf03fa423772ad1f337320b"
Apr 16 22:32:54.589515 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:54.589463 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aba874b4386f67ff90136e21f2b2e31ca808e81bbcf03fa423772ad1f337320b"} err="failed to get container status \"aba874b4386f67ff90136e21f2b2e31ca808e81bbcf03fa423772ad1f337320b\": rpc error: code = NotFound desc = could not find container \"aba874b4386f67ff90136e21f2b2e31ca808e81bbcf03fa423772ad1f337320b\": container with ID starting with aba874b4386f67ff90136e21f2b2e31ca808e81bbcf03fa423772ad1f337320b not found: ID does not exist"
Apr 16 22:32:55.996124 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:55.996064 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused"
Apr 16 22:32:56.445098 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:32:56.445043 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" path="/var/lib/kubelet/pods/73e7bc45-9f0e-4097-842f-8ac6f143dbb3/volumes"
Apr 16 22:33:02.890407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:02.890359 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused"
Apr 16 22:33:05.995323 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:05.995279 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused"
Apr 16 22:33:12.900389 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:12.900356 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:33:12.908040 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:12.908014 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:33:14.141830 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:14.141799 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"]
Apr 16 22:33:14.582146 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:14.582062 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerName="main" containerID="cri-o://4386a5b493ca62956722ae7763b732e3913cd12002c2c180b45dcdb19383b17e" gracePeriod=30
Apr 16 22:33:15.995897 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:15.995855 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused"
Apr 16 22:33:25.995474 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:25.995432 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused"
Apr 16 22:33:35.995328 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:35.995277 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused"
Apr 16 22:33:40.419933 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:40.419904 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/0.log"
Apr 16 22:33:40.422017 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:40.421994 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/0.log"
Apr 16 22:33:44.841621 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:44.841556 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6b98876994-475pl_7157f0e0-543a-4348-95d9-2450299a9c4a/main/0.log"
Apr 16 22:33:44.841983 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:44.841951 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:33:44.960404 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:44.960371 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-home\") pod \"7157f0e0-543a-4348-95d9-2450299a9c4a\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") "
Apr 16 22:33:44.960404 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:44.960412 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-model-cache\") pod \"7157f0e0-543a-4348-95d9-2450299a9c4a\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") "
Apr 16 22:33:44.960642 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:44.960440 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-kserve-provision-location\") pod \"7157f0e0-543a-4348-95d9-2450299a9c4a\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") "
Apr 16 22:33:44.960642 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:44.960471 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7157f0e0-543a-4348-95d9-2450299a9c4a-tls-certs\") pod \"7157f0e0-543a-4348-95d9-2450299a9c4a\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") "
Apr 16 22:33:44.960642 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:44.960491 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ghxz\" (UniqueName: \"kubernetes.io/projected/7157f0e0-543a-4348-95d9-2450299a9c4a-kube-api-access-9ghxz\") pod \"7157f0e0-543a-4348-95d9-2450299a9c4a\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") "
Apr 16 22:33:44.960642 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:44.960562 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-dshm\") pod \"7157f0e0-543a-4348-95d9-2450299a9c4a\" (UID: \"7157f0e0-543a-4348-95d9-2450299a9c4a\") "
Apr 16 22:33:44.960887 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:44.960698 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-model-cache" (OuterVolumeSpecName: "model-cache") pod "7157f0e0-543a-4348-95d9-2450299a9c4a" (UID: "7157f0e0-543a-4348-95d9-2450299a9c4a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:33:44.960887 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:44.960800 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-home" (OuterVolumeSpecName: "home") pod "7157f0e0-543a-4348-95d9-2450299a9c4a" (UID: "7157f0e0-543a-4348-95d9-2450299a9c4a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:33:44.960887 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:44.960842 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-model-cache\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:33:44.962742 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:44.962710 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7157f0e0-543a-4348-95d9-2450299a9c4a-kube-api-access-9ghxz" (OuterVolumeSpecName: "kube-api-access-9ghxz") pod "7157f0e0-543a-4348-95d9-2450299a9c4a" (UID: "7157f0e0-543a-4348-95d9-2450299a9c4a"). InnerVolumeSpecName "kube-api-access-9ghxz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:33:44.962937 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:44.962909 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7157f0e0-543a-4348-95d9-2450299a9c4a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7157f0e0-543a-4348-95d9-2450299a9c4a" (UID: "7157f0e0-543a-4348-95d9-2450299a9c4a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:33:44.962981 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:44.962934 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-dshm" (OuterVolumeSpecName: "dshm") pod "7157f0e0-543a-4348-95d9-2450299a9c4a" (UID: "7157f0e0-543a-4348-95d9-2450299a9c4a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:33:45.016275 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.016211 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7157f0e0-543a-4348-95d9-2450299a9c4a" (UID: "7157f0e0-543a-4348-95d9-2450299a9c4a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:33:45.061561 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.061525 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-dshm\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:33:45.061561 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.061558 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-home\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:33:45.061718 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.061568 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7157f0e0-543a-4348-95d9-2450299a9c4a-kserve-provision-location\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:33:45.061718 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.061578 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7157f0e0-543a-4348-95d9-2450299a9c4a-tls-certs\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:33:45.061718 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.061587 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9ghxz\" (UniqueName: \"kubernetes.io/projected/7157f0e0-543a-4348-95d9-2450299a9c4a-kube-api-access-9ghxz\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:33:45.708071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.708040 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6b98876994-475pl_7157f0e0-543a-4348-95d9-2450299a9c4a/main/0.log"
Apr 16 22:33:45.708454 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.708429 2572 generic.go:358] "Generic (PLEG): container finished" podID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerID="4386a5b493ca62956722ae7763b732e3913cd12002c2c180b45dcdb19383b17e" exitCode=137
Apr 16 22:33:45.708545 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.708509 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"
Apr 16 22:33:45.708606 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.708507 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" event={"ID":"7157f0e0-543a-4348-95d9-2450299a9c4a","Type":"ContainerDied","Data":"4386a5b493ca62956722ae7763b732e3913cd12002c2c180b45dcdb19383b17e"}
Apr 16 22:33:45.708661 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.708610 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl" event={"ID":"7157f0e0-543a-4348-95d9-2450299a9c4a","Type":"ContainerDied","Data":"d3dced18f71737355f1520607d2953bf5409b2e3a8238b100b5929bb49019246"}
Apr 16 22:33:45.708661 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.708627 2572 scope.go:117] "RemoveContainer" containerID="4386a5b493ca62956722ae7763b732e3913cd12002c2c180b45dcdb19383b17e"
Apr 16 22:33:45.727965 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.727943 2572 scope.go:117] "RemoveContainer" containerID="d1c8c46375e3a2b392754d540dda44c16d12a1c7be0d0bd768de9905d59bec66"
Apr 16 22:33:45.731223 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.731199 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"]
Apr 16 22:33:45.734032 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.734012 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6b98876994-475pl"]
Apr 16 22:33:45.738561 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.738543 2572 scope.go:117] "RemoveContainer" containerID="4386a5b493ca62956722ae7763b732e3913cd12002c2c180b45dcdb19383b17e"
Apr 16 22:33:45.738828 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:33:45.738810 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4386a5b493ca62956722ae7763b732e3913cd12002c2c180b45dcdb19383b17e\": container with ID starting with 4386a5b493ca62956722ae7763b732e3913cd12002c2c180b45dcdb19383b17e not found: ID does not exist" containerID="4386a5b493ca62956722ae7763b732e3913cd12002c2c180b45dcdb19383b17e"
Apr 16 22:33:45.738906 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.738834 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4386a5b493ca62956722ae7763b732e3913cd12002c2c180b45dcdb19383b17e"} err="failed to get container status \"4386a5b493ca62956722ae7763b732e3913cd12002c2c180b45dcdb19383b17e\": rpc error: code = NotFound desc = could not find container \"4386a5b493ca62956722ae7763b732e3913cd12002c2c180b45dcdb19383b17e\": container with ID starting with 4386a5b493ca62956722ae7763b732e3913cd12002c2c180b45dcdb19383b17e not found: ID does not exist"
Apr 16 22:33:45.738906 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.738850 2572 scope.go:117] "RemoveContainer" containerID="d1c8c46375e3a2b392754d540dda44c16d12a1c7be0d0bd768de9905d59bec66"
Apr 16 22:33:45.739069 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:33:45.739054 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1c8c46375e3a2b392754d540dda44c16d12a1c7be0d0bd768de9905d59bec66\": container with ID starting with d1c8c46375e3a2b392754d540dda44c16d12a1c7be0d0bd768de9905d59bec66 not found: ID does not exist" containerID="d1c8c46375e3a2b392754d540dda44c16d12a1c7be0d0bd768de9905d59bec66"
Apr 16 22:33:45.739125 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.739074 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1c8c46375e3a2b392754d540dda44c16d12a1c7be0d0bd768de9905d59bec66"} err="failed to get container status \"d1c8c46375e3a2b392754d540dda44c16d12a1c7be0d0bd768de9905d59bec66\": rpc error: code = NotFound desc = could not find container \"d1c8c46375e3a2b392754d540dda44c16d12a1c7be0d0bd768de9905d59bec66\": container with ID starting with d1c8c46375e3a2b392754d540dda44c16d12a1c7be0d0bd768de9905d59bec66 not found: ID does not exist"
Apr 16 22:33:45.995535 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:45.995436 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused"
Apr 16 22:33:46.444093 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:46.444062 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" path="/var/lib/kubelet/pods/7157f0e0-543a-4348-95d9-2450299a9c4a/volumes"
Apr 16 22:33:55.995470 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:33:55.995421 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused"
Apr 16 22:34:05.995197 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:05.995154 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8000/health\": dial tcp 10.134.0.38:8000: connect: connection refused"
Apr 16 22:34:16.005028 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:16.004995 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th"
Apr 16 22:34:16.012707 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:16.012685 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th"
Apr 16 22:34:22.214322 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:22.214236 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th"]
Apr 16 22:34:22.216962 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:22.214577 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerName="main" containerID="cri-o://f2efc815644c7473bcb40aac6f7a5d56b1ce5a14d5f68abb8687801af221260e" gracePeriod=30
Apr 16 22:34:29.593695 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.593660 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"]
Apr 16 22:34:29.594082 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.594067 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerName="main"
Apr 16 22:34:29.594142 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.594084 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerName="main"
Apr 16 22:34:29.594142 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.594096 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerName="storage-initializer"
Apr 16 22:34:29.594142 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.594117 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerName="storage-initializer"
Apr 16 22:34:29.594142 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.594130 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerName="storage-initializer"
Apr 16 22:34:29.594142 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.594138 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerName="storage-initializer"
Apr 16 22:34:29.594300 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.594158 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerName="main"
Apr 16 22:34:29.594300 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.594163 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerName="main"
Apr 16 22:34:29.594300 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.594226 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="73e7bc45-9f0e-4097-842f-8ac6f143dbb3" containerName="main"
Apr 16 22:34:29.594300 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.594236 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7157f0e0-543a-4348-95d9-2450299a9c4a" containerName="main"
Apr 16 22:34:29.604301 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.604277 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.607053 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.607030 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\""
Apr 16 22:34:29.608719 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.608695 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"]
Apr 16 22:34:29.647914 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.647876 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.648098 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.647928 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.648098 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.647984 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ba077ded-2e20-4109-9bf6-f611e7ce1166-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.648098 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.648042 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64rx9\" (UniqueName: \"kubernetes.io/projected/ba077ded-2e20-4109-9bf6-f611e7ce1166-kube-api-access-64rx9\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.648098 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.648068 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.648313 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.648169 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.748858 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.748818 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.749031 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.748868 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ba077ded-2e20-4109-9bf6-f611e7ce1166-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.749031 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.748935 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64rx9\" (UniqueName: \"kubernetes.io/projected/ba077ded-2e20-4109-9bf6-f611e7ce1166-kube-api-access-64rx9\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.749031 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.748965 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.749031 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.749013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.749269 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.749176 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.749355 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.749333 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.749471 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.749413 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.749471 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.749440 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.751236 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.751213 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.751374 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.751357 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ba077ded-2e20-4109-9bf6-f611e7ce1166-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.756164 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.756143 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64rx9\" (UniqueName: \"kubernetes.io/projected/ba077ded-2e20-4109-9bf6-f611e7ce1166-kube-api-access-64rx9\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:34:29.915059 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:29.914975 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" Apr 16 22:34:30.039113 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:30.039078 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"] Apr 16 22:34:30.041137 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:34:30.041078 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba077ded_2e20_4109_9bf6_f611e7ce1166.slice/crio-babdc1f12128aa9789b6d85efbef6cf5ed8c8c01536c776d700e6abbd512afeb WatchSource:0}: Error finding container babdc1f12128aa9789b6d85efbef6cf5ed8c8c01536c776d700e6abbd512afeb: Status 404 returned error can't find the container with id babdc1f12128aa9789b6d85efbef6cf5ed8c8c01536c776d700e6abbd512afeb Apr 16 22:34:30.867674 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:30.867636 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" event={"ID":"ba077ded-2e20-4109-9bf6-f611e7ce1166","Type":"ContainerStarted","Data":"6dd7af3bc9bfe1c23b185aab5dbd944edbda46752a0852d3967ecc0154bfb657"} Apr 16 22:34:30.867674 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:30.867678 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" event={"ID":"ba077ded-2e20-4109-9bf6-f611e7ce1166","Type":"ContainerStarted","Data":"babdc1f12128aa9789b6d85efbef6cf5ed8c8c01536c776d700e6abbd512afeb"} Apr 16 22:34:34.882644 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:34.882605 2572 generic.go:358] "Generic (PLEG): container finished" podID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerID="6dd7af3bc9bfe1c23b185aab5dbd944edbda46752a0852d3967ecc0154bfb657" exitCode=0 Apr 16 22:34:34.883097 ip-10-0-133-16 kubenswrapper[2572]: I0416 
22:34:34.882658 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" event={"ID":"ba077ded-2e20-4109-9bf6-f611e7ce1166","Type":"ContainerDied","Data":"6dd7af3bc9bfe1c23b185aab5dbd944edbda46752a0852d3967ecc0154bfb657"} Apr 16 22:34:35.888313 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:35.888275 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" event={"ID":"ba077ded-2e20-4109-9bf6-f611e7ce1166","Type":"ContainerStarted","Data":"2b808038c243aa80cd4f92a2000f9823db2d3ff3b3a1a446103254a6e129aef3"} Apr 16 22:34:35.910032 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:35.909984 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podStartSLOduration=6.90997193 podStartE2EDuration="6.90997193s" podCreationTimestamp="2026-04-16 22:34:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:34:35.908012852 +0000 UTC m=+1256.133179232" watchObservedRunningTime="2026-04-16 22:34:35.90997193 +0000 UTC m=+1256.135138309" Apr 16 22:34:39.916051 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:39.916018 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" Apr 16 22:34:39.916051 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:39.916049 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" Apr 16 22:34:39.917681 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:39.917651 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 22:34:41.096887 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.096738 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs"] Apr 16 22:34:41.104268 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.104239 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9"] Apr 16 22:34:41.104435 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.104412 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.108249 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.108228 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 22:34:41.108375 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.108281 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-jwdcm\"" Apr 16 22:34:41.109214 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.109197 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.116916 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.116895 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs"] Apr 16 22:34:41.134654 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.134624 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9"] Apr 16 22:34:41.152932 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.152899 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.153087 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.152936 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.153087 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.152963 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.153087 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.153024 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw6l2\" (UniqueName: \"kubernetes.io/projected/1e377d24-a06e-4cfa-b31f-f7cb1731b151-kube-api-access-qw6l2\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.153087 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.153060 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.153087 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.153085 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.153375 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.153191 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e377d24-a06e-4cfa-b31f-f7cb1731b151-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.153375 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.153267 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfrvw\" (UniqueName: \"kubernetes.io/projected/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-kube-api-access-xfrvw\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.153375 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.153305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.153375 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.153359 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.153563 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.153399 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: 
\"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.153563 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.153454 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.254081 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254039 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.254289 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254089 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.254289 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254151 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: 
\"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.254289 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254167 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.254289 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254184 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.254289 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254207 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw6l2\" (UniqueName: \"kubernetes.io/projected/1e377d24-a06e-4cfa-b31f-f7cb1731b151-kube-api-access-qw6l2\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.254289 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254231 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.254608 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254510 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.254608 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.254608 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254558 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.254608 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254572 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e377d24-a06e-4cfa-b31f-f7cb1731b151-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.254608 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254586 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.254855 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254661 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfrvw\" (UniqueName: \"kubernetes.io/projected/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-kube-api-access-xfrvw\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.254855 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.254855 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254755 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.254855 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.254796 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.255116 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.255066 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.255248 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.255181 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.256982 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.256952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.257086 ip-10-0-133-16 
kubenswrapper[2572]: I0416 22:34:41.256999 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.257086 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.257055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e377d24-a06e-4cfa-b31f-f7cb1731b151-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.257226 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.257100 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.262368 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.262337 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw6l2\" (UniqueName: \"kubernetes.io/projected/1e377d24-a06e-4cfa-b31f-f7cb1731b151-kube-api-access-qw6l2\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.262617 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.262596 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xfrvw\" (UniqueName: \"kubernetes.io/projected/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-kube-api-access-xfrvw\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.420259 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.420168 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:41.426975 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.426951 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:41.564412 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.564381 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs"] Apr 16 22:34:41.565699 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:34:41.565675 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e377d24_a06e_4cfa_b31f_f7cb1731b151.slice/crio-41e58de175784e9b4d8f82fb36bad01220e3eacd7fa5932ef765770d663c1045 WatchSource:0}: Error finding container 41e58de175784e9b4d8f82fb36bad01220e3eacd7fa5932ef765770d663c1045: Status 404 returned error can't find the container with id 41e58de175784e9b4d8f82fb36bad01220e3eacd7fa5932ef765770d663c1045 Apr 16 22:34:41.581980 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.581956 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9"] Apr 16 22:34:41.583276 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:34:41.583249 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9047af0_fa0c_4323_80c4_7e1bdd8cc483.slice/crio-5ea8d8dd951120a4543e74ee4919036858fde041745bfe05bec702fe022f857c WatchSource:0}: Error finding container 5ea8d8dd951120a4543e74ee4919036858fde041745bfe05bec702fe022f857c: Status 404 returned error can't find the container with id 5ea8d8dd951120a4543e74ee4919036858fde041745bfe05bec702fe022f857c Apr 16 22:34:41.910019 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.909943 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" event={"ID":"1e377d24-a06e-4cfa-b31f-f7cb1731b151","Type":"ContainerStarted","Data":"41e58de175784e9b4d8f82fb36bad01220e3eacd7fa5932ef765770d663c1045"} Apr 16 22:34:41.911738 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.911700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" event={"ID":"a9047af0-fa0c-4323-80c4-7e1bdd8cc483","Type":"ContainerStarted","Data":"65070b0f7ae81a2634dee3e21664a48714c23b9a869c15665eac456e6b1854bb"} Apr 16 22:34:41.911882 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:41.911743 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" event={"ID":"a9047af0-fa0c-4323-80c4-7e1bdd8cc483","Type":"ContainerStarted","Data":"5ea8d8dd951120a4543e74ee4919036858fde041745bfe05bec702fe022f857c"} Apr 16 22:34:43.921864 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:43.921821 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" event={"ID":"1e377d24-a06e-4cfa-b31f-f7cb1731b151","Type":"ContainerStarted","Data":"d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1"} Apr 16 22:34:43.922327 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:43.921882 
2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:44.927936 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:44.927900 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" event={"ID":"1e377d24-a06e-4cfa-b31f-f7cb1731b151","Type":"ContainerStarted","Data":"f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a"} Apr 16 22:34:45.933619 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:45.933573 2572 generic.go:358] "Generic (PLEG): container finished" podID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerID="65070b0f7ae81a2634dee3e21664a48714c23b9a869c15665eac456e6b1854bb" exitCode=0 Apr 16 22:34:45.933982 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:45.933645 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" event={"ID":"a9047af0-fa0c-4323-80c4-7e1bdd8cc483","Type":"ContainerDied","Data":"65070b0f7ae81a2634dee3e21664a48714c23b9a869c15665eac456e6b1854bb"} Apr 16 22:34:46.940036 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:46.939987 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" event={"ID":"a9047af0-fa0c-4323-80c4-7e1bdd8cc483","Type":"ContainerStarted","Data":"e705ea64af3f7098bffb6cdf0bb58793eaa2e326947f414d77b1c8f242d2a644"} Apr 16 22:34:46.961908 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:46.961846 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podStartSLOduration=5.961825815 podStartE2EDuration="5.961825815s" podCreationTimestamp="2026-04-16 22:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:34:46.959823534 +0000 UTC m=+1267.184989914" watchObservedRunningTime="2026-04-16 22:34:46.961825815 +0000 UTC m=+1267.186992195" Apr 16 22:34:48.953668 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:48.953629 2572 generic.go:358] "Generic (PLEG): container finished" podID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerID="f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a" exitCode=0 Apr 16 22:34:48.954161 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:48.953714 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" event={"ID":"1e377d24-a06e-4cfa-b31f-f7cb1731b151","Type":"ContainerDied","Data":"f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a"} Apr 16 22:34:49.916426 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:49.916375 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 22:34:49.960024 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:49.959983 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" event={"ID":"1e377d24-a06e-4cfa-b31f-f7cb1731b151","Type":"ContainerStarted","Data":"ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a"} Apr 16 22:34:49.984985 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:49.984932 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podStartSLOduration=7.650281653 podStartE2EDuration="8.984916995s" 
podCreationTimestamp="2026-04-16 22:34:41 +0000 UTC" firstStartedPulling="2026-04-16 22:34:41.56791288 +0000 UTC m=+1261.793079237" lastFinishedPulling="2026-04-16 22:34:42.902548208 +0000 UTC m=+1263.127714579" observedRunningTime="2026-04-16 22:34:49.984736058 +0000 UTC m=+1270.209902437" watchObservedRunningTime="2026-04-16 22:34:49.984916995 +0000 UTC m=+1270.210083376" Apr 16 22:34:51.421214 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:51.421170 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:51.421214 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:51.421225 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:34:51.422927 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:51.422887 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 16 22:34:51.428068 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:51.428045 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:51.428194 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:51.428084 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:34:51.429630 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:51.429605 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 16 22:34:52.534507 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.534482 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-99d4b7f4-dc5th_4c0128a6-01ea-4f34-af8a-568ad5692849/main/0.log" Apr 16 22:34:52.534921 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.534864 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:34:52.667478 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.667436 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-home\") pod \"4c0128a6-01ea-4f34-af8a-568ad5692849\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " Apr 16 22:34:52.667665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.667490 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb7bw\" (UniqueName: \"kubernetes.io/projected/4c0128a6-01ea-4f34-af8a-568ad5692849-kube-api-access-kb7bw\") pod \"4c0128a6-01ea-4f34-af8a-568ad5692849\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " Apr 16 22:34:52.667665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.667536 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4c0128a6-01ea-4f34-af8a-568ad5692849-tls-certs\") pod \"4c0128a6-01ea-4f34-af8a-568ad5692849\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " Apr 16 22:34:52.667665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.667601 2572 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-dshm\") pod \"4c0128a6-01ea-4f34-af8a-568ad5692849\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " Apr 16 22:34:52.667836 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.667689 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-model-cache\") pod \"4c0128a6-01ea-4f34-af8a-568ad5692849\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " Apr 16 22:34:52.667836 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.667744 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-kserve-provision-location\") pod \"4c0128a6-01ea-4f34-af8a-568ad5692849\" (UID: \"4c0128a6-01ea-4f34-af8a-568ad5692849\") " Apr 16 22:34:52.667936 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.667825 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-home" (OuterVolumeSpecName: "home") pod "4c0128a6-01ea-4f34-af8a-568ad5692849" (UID: "4c0128a6-01ea-4f34-af8a-568ad5692849"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:34:52.668015 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.667991 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-model-cache" (OuterVolumeSpecName: "model-cache") pod "4c0128a6-01ea-4f34-af8a-568ad5692849" (UID: "4c0128a6-01ea-4f34-af8a-568ad5692849"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:34:52.668637 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.668614 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-model-cache\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:34:52.668828 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.668813 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-home\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:34:52.670322 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.670293 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-dshm" (OuterVolumeSpecName: "dshm") pod "4c0128a6-01ea-4f34-af8a-568ad5692849" (UID: "4c0128a6-01ea-4f34-af8a-568ad5692849"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:34:52.670566 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.670540 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c0128a6-01ea-4f34-af8a-568ad5692849-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4c0128a6-01ea-4f34-af8a-568ad5692849" (UID: "4c0128a6-01ea-4f34-af8a-568ad5692849"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:34:52.670671 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.670641 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0128a6-01ea-4f34-af8a-568ad5692849-kube-api-access-kb7bw" (OuterVolumeSpecName: "kube-api-access-kb7bw") pod "4c0128a6-01ea-4f34-af8a-568ad5692849" (UID: "4c0128a6-01ea-4f34-af8a-568ad5692849"). InnerVolumeSpecName "kube-api-access-kb7bw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:34:52.734415 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.734360 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4c0128a6-01ea-4f34-af8a-568ad5692849" (UID: "4c0128a6-01ea-4f34-af8a-568ad5692849"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:34:52.770305 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.770218 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-kserve-provision-location\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:34:52.770305 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.770260 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kb7bw\" (UniqueName: \"kubernetes.io/projected/4c0128a6-01ea-4f34-af8a-568ad5692849-kube-api-access-kb7bw\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:34:52.770305 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.770277 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4c0128a6-01ea-4f34-af8a-568ad5692849-tls-certs\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:34:52.770305 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.770292 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4c0128a6-01ea-4f34-af8a-568ad5692849-dshm\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:34:52.975685 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.975652 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-99d4b7f4-dc5th_4c0128a6-01ea-4f34-af8a-568ad5692849/main/0.log" Apr 16 22:34:52.976044 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.976019 2572 generic.go:358] "Generic (PLEG): container finished" podID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerID="f2efc815644c7473bcb40aac6f7a5d56b1ce5a14d5f68abb8687801af221260e" exitCode=137 Apr 16 22:34:52.976199 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.976094 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" event={"ID":"4c0128a6-01ea-4f34-af8a-568ad5692849","Type":"ContainerDied","Data":"f2efc815644c7473bcb40aac6f7a5d56b1ce5a14d5f68abb8687801af221260e"} Apr 16 22:34:52.976199 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.976137 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" Apr 16 22:34:52.976199 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.976152 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th" event={"ID":"4c0128a6-01ea-4f34-af8a-568ad5692849","Type":"ContainerDied","Data":"b2e4c153a7b37c46a81dc940223067585139ce1f3c1713fb58cb52a093c785fc"} Apr 16 22:34:52.976199 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:52.976175 2572 scope.go:117] "RemoveContainer" containerID="f2efc815644c7473bcb40aac6f7a5d56b1ce5a14d5f68abb8687801af221260e" Apr 16 22:34:53.002866 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:53.002828 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th"] Apr 16 22:34:53.008908 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:53.008839 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-99d4b7f4-dc5th"] Apr 16 22:34:53.008908 ip-10-0-133-16 
kubenswrapper[2572]: I0416 22:34:53.008879 2572 scope.go:117] "RemoveContainer" containerID="3eb5a610023e0af520499356c6ec36021be724e12e8dd0036bccfc65f58be70f" Apr 16 22:34:53.022102 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:53.022075 2572 scope.go:117] "RemoveContainer" containerID="f2efc815644c7473bcb40aac6f7a5d56b1ce5a14d5f68abb8687801af221260e" Apr 16 22:34:53.022444 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:34:53.022419 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2efc815644c7473bcb40aac6f7a5d56b1ce5a14d5f68abb8687801af221260e\": container with ID starting with f2efc815644c7473bcb40aac6f7a5d56b1ce5a14d5f68abb8687801af221260e not found: ID does not exist" containerID="f2efc815644c7473bcb40aac6f7a5d56b1ce5a14d5f68abb8687801af221260e" Apr 16 22:34:53.022523 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:53.022455 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2efc815644c7473bcb40aac6f7a5d56b1ce5a14d5f68abb8687801af221260e"} err="failed to get container status \"f2efc815644c7473bcb40aac6f7a5d56b1ce5a14d5f68abb8687801af221260e\": rpc error: code = NotFound desc = could not find container \"f2efc815644c7473bcb40aac6f7a5d56b1ce5a14d5f68abb8687801af221260e\": container with ID starting with f2efc815644c7473bcb40aac6f7a5d56b1ce5a14d5f68abb8687801af221260e not found: ID does not exist" Apr 16 22:34:53.022523 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:53.022481 2572 scope.go:117] "RemoveContainer" containerID="3eb5a610023e0af520499356c6ec36021be724e12e8dd0036bccfc65f58be70f" Apr 16 22:34:53.022778 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:34:53.022757 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eb5a610023e0af520499356c6ec36021be724e12e8dd0036bccfc65f58be70f\": container with ID starting with 
3eb5a610023e0af520499356c6ec36021be724e12e8dd0036bccfc65f58be70f not found: ID does not exist" containerID="3eb5a610023e0af520499356c6ec36021be724e12e8dd0036bccfc65f58be70f" Apr 16 22:34:53.022834 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:53.022787 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb5a610023e0af520499356c6ec36021be724e12e8dd0036bccfc65f58be70f"} err="failed to get container status \"3eb5a610023e0af520499356c6ec36021be724e12e8dd0036bccfc65f58be70f\": rpc error: code = NotFound desc = could not find container \"3eb5a610023e0af520499356c6ec36021be724e12e8dd0036bccfc65f58be70f\": container with ID starting with 3eb5a610023e0af520499356c6ec36021be724e12e8dd0036bccfc65f58be70f not found: ID does not exist" Apr 16 22:34:54.456264 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:54.456224 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" path="/var/lib/kubelet/pods/4c0128a6-01ea-4f34-af8a-568ad5692849/volumes" Apr 16 22:34:59.916211 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:34:59.916160 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 22:35:01.421604 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:01.421549 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 16 22:35:01.427707 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:01.427663 2572 prober.go:120] "Probe 
failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 16 22:35:01.434141 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:01.434092 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:35:09.916279 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:09.916222 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 22:35:11.420679 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:11.420630 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 16 22:35:11.428207 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:11.428154 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 16 22:35:19.915571 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:19.915518 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 22:35:21.420903 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:21.420852 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 16 22:35:21.428025 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:21.427993 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 16 22:35:29.916056 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:29.915997 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 22:35:31.421226 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:31.421167 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 16 22:35:31.428316 ip-10-0-133-16 
kubenswrapper[2572]: I0416 22:35:31.428238 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 16 22:35:39.915551 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:39.915499 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 22:35:41.420952 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:41.420904 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 16 22:35:41.427696 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:41.427663 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 16 22:35:49.915931 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:49.915836 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 22:35:51.420998 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:51.420945 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 16 22:35:51.427826 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:51.427792 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 16 22:35:59.915506 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:35:59.915461 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 22:36:01.421589 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:01.421527 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 16 22:36:01.427827 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:01.427786 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" 
podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 16 22:36:09.916410 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:09.916367 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 22:36:11.420566 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:11.420517 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 16 22:36:11.427726 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:11.427690 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 16 22:36:19.916431 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:19.916379 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 22:36:21.421036 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:21.420989 2572 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 16 22:36:21.428166 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:21.428127 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 16 22:36:29.916195 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:29.916148 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 22:36:31.420840 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:31.420790 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused" Apr 16 22:36:31.427567 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:31.427524 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 16 
22:36:39.915573 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:39.915521 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused"
Apr 16 22:36:41.421377 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:41.421314 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused"
Apr 16 22:36:41.427658 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:41.427623 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused"
Apr 16 22:36:49.916144 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:49.916081 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused"
Apr 16 22:36:51.420733 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:51.420684 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused"
Apr 16 22:36:51.428059 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:51.428024 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused"
Apr 16 22:36:59.925348 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:59.925308 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:36:59.933037 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:36:59.933011 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:37:01.420938 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:01.420889 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused"
Apr 16 22:37:01.427698 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:01.427665 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused"
Apr 16 22:37:10.672095 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:10.672059 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"]
Apr 16 22:37:10.672633 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:10.672445 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main" containerID="cri-o://2b808038c243aa80cd4f92a2000f9823db2d3ff3b3a1a446103254a6e129aef3" gracePeriod=30
Apr 16 22:37:11.421001 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:11.420958 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused"
Apr 16 22:37:11.428128 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:11.428076 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused"
Apr 16 22:37:17.390407 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.390366 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 22:37:17.390872 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.390779 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerName="storage-initializer"
Apr 16 22:37:17.390872 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.390791 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerName="storage-initializer"
Apr 16 22:37:17.390872 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.390810 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerName="main"
Apr 16 22:37:17.390872 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.390815 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerName="main"
Apr 16 22:37:17.391014 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.390883 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c0128a6-01ea-4f34-af8a-568ad5692849" containerName="main"
Apr 16 22:37:17.396985 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.396966 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.399688 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.399660 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-l254r\""
Apr 16 22:37:17.399811 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.399664 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 16 22:37:17.405915 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.405890 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 22:37:17.444653 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.444615 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.444823 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.444726 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.444823 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.444811 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lldbp\" (UniqueName: \"kubernetes.io/projected/792e8531-e78c-479f-9b40-685b5a393a1a-kube-api-access-lldbp\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.444935 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.444859 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.444935 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.444922 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/792e8531-e78c-479f-9b40-685b5a393a1a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.445016 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.444954 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.546481 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.546446 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.546672 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.546525 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lldbp\" (UniqueName: \"kubernetes.io/projected/792e8531-e78c-479f-9b40-685b5a393a1a-kube-api-access-lldbp\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.546672 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.546567 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.546672 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.546605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/792e8531-e78c-479f-9b40-685b5a393a1a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.546672 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.546636 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.546886 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.546678 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.546941 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.546902 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.546997 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.546936 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.547076 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.547055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.548832 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.548810 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.549043 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.549028 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/792e8531-e78c-479f-9b40-685b5a393a1a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.556452 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.556427 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lldbp\" (UniqueName: \"kubernetes.io/projected/792e8531-e78c-479f-9b40-685b5a393a1a-kube-api-access-lldbp\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.708877 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.708793 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:17.839643 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.839611 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 22:37:17.841351 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:37:17.841322 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod792e8531_e78c_479f_9b40_685b5a393a1a.slice/crio-b557e688f1be71cd7941372d04420683bac53820e4eaf27cb8f9fcaa7cb376c4 WatchSource:0}: Error finding container b557e688f1be71cd7941372d04420683bac53820e4eaf27cb8f9fcaa7cb376c4: Status 404 returned error can't find the container with id b557e688f1be71cd7941372d04420683bac53820e4eaf27cb8f9fcaa7cb376c4
Apr 16 22:37:17.843618 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:17.843596 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:37:18.525570 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:18.525481 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"792e8531-e78c-479f-9b40-685b5a393a1a","Type":"ContainerStarted","Data":"10e31f0a16516ab2620265f622f5ecc6c0e39a09c10c4041cafc3fb3a70d6d9c"}
Apr 16 22:37:18.525570 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:18.525522 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"792e8531-e78c-479f-9b40-685b5a393a1a","Type":"ContainerStarted","Data":"b557e688f1be71cd7941372d04420683bac53820e4eaf27cb8f9fcaa7cb376c4"}
Apr 16 22:37:21.421296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:21.421217 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8001/health\": dial tcp 10.134.0.40:8001: connect: connection refused"
Apr 16 22:37:21.427588 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:21.427532 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused"
Apr 16 22:37:22.545252 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:22.545156 2572 generic.go:358] "Generic (PLEG): container finished" podID="792e8531-e78c-479f-9b40-685b5a393a1a" containerID="10e31f0a16516ab2620265f622f5ecc6c0e39a09c10c4041cafc3fb3a70d6d9c" exitCode=0
Apr 16 22:37:22.545252 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:22.545231 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"792e8531-e78c-479f-9b40-685b5a393a1a","Type":"ContainerDied","Data":"10e31f0a16516ab2620265f622f5ecc6c0e39a09c10c4041cafc3fb3a70d6d9c"}
Apr 16 22:37:23.551399 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:23.551344 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"792e8531-e78c-479f-9b40-685b5a393a1a","Type":"ContainerStarted","Data":"d4b6b0b0fad0dc99000bf15868ee1ee02e9cf7ca485749bb93b1b1df578b8cda"}
Apr 16 22:37:23.570493 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:23.570440 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=6.570424844 podStartE2EDuration="6.570424844s" podCreationTimestamp="2026-04-16 22:37:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:37:23.568557849 +0000 UTC m=+1423.793724230" watchObservedRunningTime="2026-04-16 22:37:23.570424844 +0000 UTC m=+1423.795591224"
Apr 16 22:37:27.709202 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:27.709160 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:27.711030 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:27.710987 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 22:37:31.428243 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:31.428195 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused"
Apr 16 22:37:31.438807 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:31.438776 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs"
Apr 16 22:37:31.458767 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:31.458720 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs"
Apr 16 22:37:37.710091 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:37.710049 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 22:37:40.917411 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:40.917386 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4_ba077ded-2e20-4109-9bf6-f611e7ce1166/main/0.log"
Apr 16 22:37:40.917767 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:40.917752 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:37:41.078661 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.078626 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-home\") pod \"ba077ded-2e20-4109-9bf6-f611e7ce1166\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") "
Apr 16 22:37:41.078849 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.078735 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-dshm\") pod \"ba077ded-2e20-4109-9bf6-f611e7ce1166\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") "
Apr 16 22:37:41.078849 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.078790 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-model-cache\") pod \"ba077ded-2e20-4109-9bf6-f611e7ce1166\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") "
Apr 16 22:37:41.078849 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.078818 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-kserve-provision-location\") pod \"ba077ded-2e20-4109-9bf6-f611e7ce1166\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") "
Apr 16 22:37:41.079022 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.078865 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ba077ded-2e20-4109-9bf6-f611e7ce1166-tls-certs\") pod \"ba077ded-2e20-4109-9bf6-f611e7ce1166\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") "
Apr 16 22:37:41.079022 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.078906 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64rx9\" (UniqueName: \"kubernetes.io/projected/ba077ded-2e20-4109-9bf6-f611e7ce1166-kube-api-access-64rx9\") pod \"ba077ded-2e20-4109-9bf6-f611e7ce1166\" (UID: \"ba077ded-2e20-4109-9bf6-f611e7ce1166\") "
Apr 16 22:37:41.079642 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.079618 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-model-cache" (OuterVolumeSpecName: "model-cache") pod "ba077ded-2e20-4109-9bf6-f611e7ce1166" (UID: "ba077ded-2e20-4109-9bf6-f611e7ce1166"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:37:41.079975 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.079869 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-home" (OuterVolumeSpecName: "home") pod "ba077ded-2e20-4109-9bf6-f611e7ce1166" (UID: "ba077ded-2e20-4109-9bf6-f611e7ce1166"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:37:41.093453 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.093386 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-dshm" (OuterVolumeSpecName: "dshm") pod "ba077ded-2e20-4109-9bf6-f611e7ce1166" (UID: "ba077ded-2e20-4109-9bf6-f611e7ce1166"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:37:41.097565 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.097519 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba077ded-2e20-4109-9bf6-f611e7ce1166-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ba077ded-2e20-4109-9bf6-f611e7ce1166" (UID: "ba077ded-2e20-4109-9bf6-f611e7ce1166"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:37:41.099676 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.099649 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba077ded-2e20-4109-9bf6-f611e7ce1166-kube-api-access-64rx9" (OuterVolumeSpecName: "kube-api-access-64rx9") pod "ba077ded-2e20-4109-9bf6-f611e7ce1166" (UID: "ba077ded-2e20-4109-9bf6-f611e7ce1166"). InnerVolumeSpecName "kube-api-access-64rx9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:37:41.173780 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.173733 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ba077ded-2e20-4109-9bf6-f611e7ce1166" (UID: "ba077ded-2e20-4109-9bf6-f611e7ce1166"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:37:41.179972 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.179945 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-home\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:37:41.179972 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.179974 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-dshm\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:37:41.180145 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.179987 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-model-cache\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:37:41.180145 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.180003 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba077ded-2e20-4109-9bf6-f611e7ce1166-kserve-provision-location\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:37:41.180145 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.180015 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ba077ded-2e20-4109-9bf6-f611e7ce1166-tls-certs\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:37:41.180145 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.180027 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-64rx9\" (UniqueName: \"kubernetes.io/projected/ba077ded-2e20-4109-9bf6-f611e7ce1166-kube-api-access-64rx9\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:37:41.438954 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.438862 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9"
Apr 16 22:37:41.447533 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.447504 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9"
Apr 16 22:37:41.622045 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.622015 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4_ba077ded-2e20-4109-9bf6-f611e7ce1166/main/0.log"
Apr 16 22:37:41.622410 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.622379 2572 generic.go:358] "Generic (PLEG): container finished" podID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerID="2b808038c243aa80cd4f92a2000f9823db2d3ff3b3a1a446103254a6e129aef3" exitCode=137
Apr 16 22:37:41.622527 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.622443 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"
Apr 16 22:37:41.622527 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.622463 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" event={"ID":"ba077ded-2e20-4109-9bf6-f611e7ce1166","Type":"ContainerDied","Data":"2b808038c243aa80cd4f92a2000f9823db2d3ff3b3a1a446103254a6e129aef3"}
Apr 16 22:37:41.622527 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.622506 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4" event={"ID":"ba077ded-2e20-4109-9bf6-f611e7ce1166","Type":"ContainerDied","Data":"babdc1f12128aa9789b6d85efbef6cf5ed8c8c01536c776d700e6abbd512afeb"}
Apr 16 22:37:41.622679 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.622528 2572 scope.go:117] "RemoveContainer" containerID="2b808038c243aa80cd4f92a2000f9823db2d3ff3b3a1a446103254a6e129aef3"
Apr 16 22:37:41.643969 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.643775 2572 scope.go:117] "RemoveContainer" containerID="6dd7af3bc9bfe1c23b185aab5dbd944edbda46752a0852d3967ecc0154bfb657"
Apr 16 22:37:41.647073 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.647046 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"]
Apr 16 22:37:41.651715 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.651689 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-54dcd47b56lvrl4"]
Apr 16 22:37:41.708699 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.708536 2572 scope.go:117] "RemoveContainer" containerID="2b808038c243aa80cd4f92a2000f9823db2d3ff3b3a1a446103254a6e129aef3"
Apr 16 22:37:41.708895 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:37:41.708872 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b808038c243aa80cd4f92a2000f9823db2d3ff3b3a1a446103254a6e129aef3\": container with ID starting with 2b808038c243aa80cd4f92a2000f9823db2d3ff3b3a1a446103254a6e129aef3 not found: ID does not exist" containerID="2b808038c243aa80cd4f92a2000f9823db2d3ff3b3a1a446103254a6e129aef3"
Apr 16 22:37:41.708943 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.708908 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b808038c243aa80cd4f92a2000f9823db2d3ff3b3a1a446103254a6e129aef3"} err="failed to get container status \"2b808038c243aa80cd4f92a2000f9823db2d3ff3b3a1a446103254a6e129aef3\": rpc error: code = NotFound desc = could not find container \"2b808038c243aa80cd4f92a2000f9823db2d3ff3b3a1a446103254a6e129aef3\": container with ID starting with 2b808038c243aa80cd4f92a2000f9823db2d3ff3b3a1a446103254a6e129aef3 not found: ID does not exist"
Apr 16 22:37:41.708943 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.708934 2572 scope.go:117] "RemoveContainer" containerID="6dd7af3bc9bfe1c23b185aab5dbd944edbda46752a0852d3967ecc0154bfb657"
Apr 16 22:37:41.709227 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:37:41.709203 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dd7af3bc9bfe1c23b185aab5dbd944edbda46752a0852d3967ecc0154bfb657\": container with ID starting with 6dd7af3bc9bfe1c23b185aab5dbd944edbda46752a0852d3967ecc0154bfb657 not found: ID does not exist" containerID="6dd7af3bc9bfe1c23b185aab5dbd944edbda46752a0852d3967ecc0154bfb657"
Apr 16 22:37:41.709301 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:41.709233 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dd7af3bc9bfe1c23b185aab5dbd944edbda46752a0852d3967ecc0154bfb657"} err="failed to get container status \"6dd7af3bc9bfe1c23b185aab5dbd944edbda46752a0852d3967ecc0154bfb657\": rpc error: code = NotFound desc = could not find container \"6dd7af3bc9bfe1c23b185aab5dbd944edbda46752a0852d3967ecc0154bfb657\": container with ID starting with 6dd7af3bc9bfe1c23b185aab5dbd944edbda46752a0852d3967ecc0154bfb657 not found: ID does not exist"
Apr 16 22:37:42.447454 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:42.447410 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" path="/var/lib/kubelet/pods/ba077ded-2e20-4109-9bf6-f611e7ce1166/volumes"
Apr 16 22:37:47.708995 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:47.708957 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:37:47.709534 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:47.709308 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 22:37:53.307396 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:53.307361 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9"]
Apr 16 22:37:53.307764 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:53.307711 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main" containerID="cri-o://e705ea64af3f7098bffb6cdf0bb58793eaa2e326947f414d77b1c8f242d2a644" gracePeriod=30
Apr 16 22:37:53.310810 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:53.310784 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs"]
Apr 16 22:37:53.311258 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:53.311226 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main" containerID="cri-o://ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a" gracePeriod=30
Apr 16 22:37:57.709534 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:37:57.709490 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 22:38:07.709259 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:07.709215 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 22:38:08.839006 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:08.838973 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk"]
Apr 16 22:38:08.839409 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:08.839395 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="storage-initializer"
Apr 16 22:38:08.839456 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:08.839413 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="storage-initializer"
Apr 16 22:38:08.839456 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:08.839425 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main"
Apr 16 22:38:08.839456 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:08.839430 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main"
Apr 16 22:38:08.839546 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:08.839496 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba077ded-2e20-4109-9bf6-f611e7ce1166" containerName="main"
Apr 16 22:38:08.843013 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:08.842992 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk"
Apr 16 22:38:08.845805 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:08.845780 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\""
Apr 16 22:38:08.862372 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:08.862329 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk"]
Apr 16 22:38:08.923277 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:08.923206 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpxng\" (UniqueName: \"kubernetes.io/projected/21c668be-ecde-41df-b6aa-473fbafeed94-kube-api-access-vpxng\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk"
Apr 16 22:38:08.923277 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:08.923249 2572
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21c668be-ecde-41df-b6aa-473fbafeed94-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:08.923511 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:08.923347 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:08.923511 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:08.923399 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:08.923511 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:08.923431 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:08.923511 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:08.923456 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:09.024588 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.024548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:09.024756 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.024606 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:09.024756 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.024632 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:09.024756 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.024663 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:09.024756 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.024727 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpxng\" (UniqueName: \"kubernetes.io/projected/21c668be-ecde-41df-b6aa-473fbafeed94-kube-api-access-vpxng\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:09.024965 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.024761 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21c668be-ecde-41df-b6aa-473fbafeed94-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:09.025077 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.025052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:09.025175 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.025061 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-model-cache\") pod 
\"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:09.025175 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.025094 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:09.027192 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.027160 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:09.027293 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.027277 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21c668be-ecde-41df-b6aa-473fbafeed94-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:09.032455 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.032432 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpxng\" (UniqueName: \"kubernetes.io/projected/21c668be-ecde-41df-b6aa-473fbafeed94-kube-api-access-vpxng\") pod \"custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk\" (UID: 
\"21c668be-ecde-41df-b6aa-473fbafeed94\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:09.158376 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.158303 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:09.300643 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.300609 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk"] Apr 16 22:38:09.301905 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:38:09.301878 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21c668be_ecde_41df_b6aa_473fbafeed94.slice/crio-76c0bac5e8de8b618ba354f9c9eaffe78e55aab402d23dba2c65cde489b45fd5 WatchSource:0}: Error finding container 76c0bac5e8de8b618ba354f9c9eaffe78e55aab402d23dba2c65cde489b45fd5: Status 404 returned error can't find the container with id 76c0bac5e8de8b618ba354f9c9eaffe78e55aab402d23dba2c65cde489b45fd5 Apr 16 22:38:09.737621 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.737581 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" event={"ID":"21c668be-ecde-41df-b6aa-473fbafeed94","Type":"ContainerStarted","Data":"eca0b7ddbace4b2da93fab2a4e945c9df736c51160ec063470630ebc14c83b21"} Apr 16 22:38:09.737621 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:09.737626 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" event={"ID":"21c668be-ecde-41df-b6aa-473fbafeed94","Type":"ContainerStarted","Data":"76c0bac5e8de8b618ba354f9c9eaffe78e55aab402d23dba2c65cde489b45fd5"} Apr 16 22:38:13.757841 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:13.757805 2572 
generic.go:358] "Generic (PLEG): container finished" podID="21c668be-ecde-41df-b6aa-473fbafeed94" containerID="eca0b7ddbace4b2da93fab2a4e945c9df736c51160ec063470630ebc14c83b21" exitCode=0 Apr 16 22:38:13.758276 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:13.757868 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" event={"ID":"21c668be-ecde-41df-b6aa-473fbafeed94","Type":"ContainerDied","Data":"eca0b7ddbace4b2da93fab2a4e945c9df736c51160ec063470630ebc14c83b21"} Apr 16 22:38:14.763963 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:14.763922 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" event={"ID":"21c668be-ecde-41df-b6aa-473fbafeed94","Type":"ContainerStarted","Data":"d8c2287d88f4fc9afed126e1cfb080c184b71e26da56dd0057fb8a7dd7664873"} Apr 16 22:38:14.786452 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:14.786370 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" podStartSLOduration=6.786351725 podStartE2EDuration="6.786351725s" podCreationTimestamp="2026-04-16 22:38:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:38:14.78406276 +0000 UTC m=+1475.009229154" watchObservedRunningTime="2026-04-16 22:38:14.786351725 +0000 UTC m=+1475.011518109" Apr 16 22:38:17.710258 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:17.710207 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused" Apr 16 22:38:19.158951 
ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:19.158916 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:19.159353 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:19.159253 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:38:19.160728 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:19.160699 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 16 22:38:23.312006 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.311908 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="llm-d-routing-sidecar" containerID="cri-o://d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1" gracePeriod=2 Apr 16 22:38:23.608274 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.608246 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs_1e377d24-a06e-4cfa-b31f-f7cb1731b151/main/0.log" Apr 16 22:38:23.608981 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.608962 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:38:23.669596 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.669561 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-dshm\") pod \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " Apr 16 22:38:23.669753 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.669627 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-home\") pod \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " Apr 16 22:38:23.669753 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.669667 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw6l2\" (UniqueName: \"kubernetes.io/projected/1e377d24-a06e-4cfa-b31f-f7cb1731b151-kube-api-access-qw6l2\") pod \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " Apr 16 22:38:23.669753 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.669708 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-model-cache\") pod \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " Apr 16 22:38:23.669753 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.669749 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-kserve-provision-location\") pod \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " Apr 16 22:38:23.670719 
ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.670301 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e377d24-a06e-4cfa-b31f-f7cb1731b151-tls-certs\") pod \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\" (UID: \"1e377d24-a06e-4cfa-b31f-f7cb1731b151\") " Apr 16 22:38:23.670719 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.670333 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-model-cache" (OuterVolumeSpecName: "model-cache") pod "1e377d24-a06e-4cfa-b31f-f7cb1731b151" (UID: "1e377d24-a06e-4cfa-b31f-f7cb1731b151"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:38:23.670719 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.670675 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-model-cache\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:38:23.671165 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.671128 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-home" (OuterVolumeSpecName: "home") pod "1e377d24-a06e-4cfa-b31f-f7cb1731b151" (UID: "1e377d24-a06e-4cfa-b31f-f7cb1731b151"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:38:23.673165 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.673127 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-dshm" (OuterVolumeSpecName: "dshm") pod "1e377d24-a06e-4cfa-b31f-f7cb1731b151" (UID: "1e377d24-a06e-4cfa-b31f-f7cb1731b151"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:38:23.673281 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.673202 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e377d24-a06e-4cfa-b31f-f7cb1731b151-kube-api-access-qw6l2" (OuterVolumeSpecName: "kube-api-access-qw6l2") pod "1e377d24-a06e-4cfa-b31f-f7cb1731b151" (UID: "1e377d24-a06e-4cfa-b31f-f7cb1731b151"). InnerVolumeSpecName "kube-api-access-qw6l2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:38:23.674522 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.674501 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e377d24-a06e-4cfa-b31f-f7cb1731b151-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1e377d24-a06e-4cfa-b31f-f7cb1731b151" (UID: "1e377d24-a06e-4cfa-b31f-f7cb1731b151"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:38:23.728873 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.728827 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1e377d24-a06e-4cfa-b31f-f7cb1731b151" (UID: "1e377d24-a06e-4cfa-b31f-f7cb1731b151"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:38:23.746409 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.746380 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:38:23.771654 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.771624 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qw6l2\" (UniqueName: \"kubernetes.io/projected/1e377d24-a06e-4cfa-b31f-f7cb1731b151-kube-api-access-qw6l2\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:38:23.771654 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.771655 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-kserve-provision-location\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:38:23.771857 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.771669 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1e377d24-a06e-4cfa-b31f-f7cb1731b151-tls-certs\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:38:23.771857 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.771685 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-dshm\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:38:23.771857 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.771697 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1e377d24-a06e-4cfa-b31f-f7cb1731b151-home\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:38:23.799410 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.799382 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs_1e377d24-a06e-4cfa-b31f-f7cb1731b151/main/0.log" Apr 16 22:38:23.800269 ip-10-0-133-16 kubenswrapper[2572]: I0416 
22:38:23.800175 2572 generic.go:358] "Generic (PLEG): container finished" podID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerID="ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a" exitCode=137 Apr 16 22:38:23.800269 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.800202 2572 generic.go:358] "Generic (PLEG): container finished" podID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerID="d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1" exitCode=0 Apr 16 22:38:23.800464 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.800277 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" Apr 16 22:38:23.800464 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.800276 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" event={"ID":"1e377d24-a06e-4cfa-b31f-f7cb1731b151","Type":"ContainerDied","Data":"ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a"} Apr 16 22:38:23.800464 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.800327 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" event={"ID":"1e377d24-a06e-4cfa-b31f-f7cb1731b151","Type":"ContainerDied","Data":"d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1"} Apr 16 22:38:23.800464 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.800344 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs" event={"ID":"1e377d24-a06e-4cfa-b31f-f7cb1731b151","Type":"ContainerDied","Data":"41e58de175784e9b4d8f82fb36bad01220e3eacd7fa5932ef765770d663c1045"} Apr 16 22:38:23.800464 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.800364 2572 scope.go:117] "RemoveContainer" 
containerID="ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a" Apr 16 22:38:23.802254 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.802188 2572 generic.go:358] "Generic (PLEG): container finished" podID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerID="e705ea64af3f7098bffb6cdf0bb58793eaa2e326947f414d77b1c8f242d2a644" exitCode=137 Apr 16 22:38:23.802254 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.802228 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" event={"ID":"a9047af0-fa0c-4323-80c4-7e1bdd8cc483","Type":"ContainerDied","Data":"e705ea64af3f7098bffb6cdf0bb58793eaa2e326947f414d77b1c8f242d2a644"} Apr 16 22:38:23.802421 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.802258 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" Apr 16 22:38:23.802421 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.802259 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9" event={"ID":"a9047af0-fa0c-4323-80c4-7e1bdd8cc483","Type":"ContainerDied","Data":"5ea8d8dd951120a4543e74ee4919036858fde041745bfe05bec702fe022f857c"} Apr 16 22:38:23.832233 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.832203 2572 scope.go:117] "RemoveContainer" containerID="f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a" Apr 16 22:38:23.836118 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.836076 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs"] Apr 16 22:38:23.839043 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.839020 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-55fb97d947d58vs"]
Apr 16 22:38:23.872028 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.871980 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfrvw\" (UniqueName: \"kubernetes.io/projected/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-kube-api-access-xfrvw\") pod \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") "
Apr 16 22:38:23.872266 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.872068 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-home\") pod \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") "
Apr 16 22:38:23.872266 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.872125 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-model-cache\") pod \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") "
Apr 16 22:38:23.872266 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.872171 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-kserve-provision-location\") pod \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") "
Apr 16 22:38:23.872266 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.872208 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-dshm\") pod \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") "
Apr 16 22:38:23.872506 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.872269 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-tls-certs\") pod \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\" (UID: \"a9047af0-fa0c-4323-80c4-7e1bdd8cc483\") "
Apr 16 22:38:23.874361 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.873138 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-model-cache" (OuterVolumeSpecName: "model-cache") pod "a9047af0-fa0c-4323-80c4-7e1bdd8cc483" (UID: "a9047af0-fa0c-4323-80c4-7e1bdd8cc483"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:38:23.874361 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.873293 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-home" (OuterVolumeSpecName: "home") pod "a9047af0-fa0c-4323-80c4-7e1bdd8cc483" (UID: "a9047af0-fa0c-4323-80c4-7e1bdd8cc483"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:38:23.875460 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.874732 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-kube-api-access-xfrvw" (OuterVolumeSpecName: "kube-api-access-xfrvw") pod "a9047af0-fa0c-4323-80c4-7e1bdd8cc483" (UID: "a9047af0-fa0c-4323-80c4-7e1bdd8cc483"). InnerVolumeSpecName "kube-api-access-xfrvw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:38:23.875460 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.875042 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a9047af0-fa0c-4323-80c4-7e1bdd8cc483" (UID: "a9047af0-fa0c-4323-80c4-7e1bdd8cc483"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:38:23.875635 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.875573 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-dshm" (OuterVolumeSpecName: "dshm") pod "a9047af0-fa0c-4323-80c4-7e1bdd8cc483" (UID: "a9047af0-fa0c-4323-80c4-7e1bdd8cc483"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:38:23.906424 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.906398 2572 scope.go:117] "RemoveContainer" containerID="d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1"
Apr 16 22:38:23.915968 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.915933 2572 scope.go:117] "RemoveContainer" containerID="ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a"
Apr 16 22:38:23.916371 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:38:23.916344 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a\": container with ID starting with ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a not found: ID does not exist" containerID="ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a"
Apr 16 22:38:23.916483 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.916382 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a"} err="failed to get container status \"ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a\": rpc error: code = NotFound desc = could not find container \"ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a\": container with ID starting with ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a not found: ID does not exist"
Apr 16 22:38:23.916483 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.916400 2572 scope.go:117] "RemoveContainer" containerID="f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a"
Apr 16 22:38:23.916709 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:38:23.916687 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a\": container with ID starting with f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a not found: ID does not exist" containerID="f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a"
Apr 16 22:38:23.916770 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.916714 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a"} err="failed to get container status \"f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a\": rpc error: code = NotFound desc = could not find container \"f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a\": container with ID starting with f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a not found: ID does not exist"
Apr 16 22:38:23.916770 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.916729 2572 scope.go:117] "RemoveContainer" containerID="d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1"
Apr 16 22:38:23.916995 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:38:23.916970 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1\": container with ID starting with d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1 not found: ID does not exist" containerID="d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1"
Apr 16 22:38:23.917056 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.916997 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1"} err="failed to get container status \"d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1\": rpc error: code = NotFound desc = could not find container \"d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1\": container with ID starting with d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1 not found: ID does not exist"
Apr 16 22:38:23.917056 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.917010 2572 scope.go:117] "RemoveContainer" containerID="ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a"
Apr 16 22:38:23.917387 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.917360 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a"} err="failed to get container status \"ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a\": rpc error: code = NotFound desc = could not find container \"ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a\": container with ID starting with ba75e66dd5bc3ed12ff0ab0fb4e08419170e7fbcdd4ac66b91ab6f70ec7dcc7a not found: ID does not exist"
Apr 16 22:38:23.917474 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.917391 2572 scope.go:117] "RemoveContainer" containerID="f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a"
Apr 16 22:38:23.917706 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.917678 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a"} err="failed to get container status \"f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a\": rpc error: code = NotFound desc = could not find container \"f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a\": container with ID starting with f01d56fd15396424656b58c68071654bdfcac5afb7dacdad4eb67cb6e8415d0a not found: ID does not exist"
Apr 16 22:38:23.917804 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.917708 2572 scope.go:117] "RemoveContainer" containerID="d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1"
Apr 16 22:38:23.918022 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.917993 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1"} err="failed to get container status \"d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1\": rpc error: code = NotFound desc = could not find container \"d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1\": container with ID starting with d066a7387363227a3d6935385d8451a49122c8b4d1f370091e8272320f4985c1 not found: ID does not exist"
Apr 16 22:38:23.918022 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.918018 2572 scope.go:117] "RemoveContainer" containerID="e705ea64af3f7098bffb6cdf0bb58793eaa2e326947f414d77b1c8f242d2a644"
Apr 16 22:38:23.934119 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.934082 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a9047af0-fa0c-4323-80c4-7e1bdd8cc483" (UID: "a9047af0-fa0c-4323-80c4-7e1bdd8cc483"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:38:23.948358 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.948337 2572 scope.go:117] "RemoveContainer" containerID="65070b0f7ae81a2634dee3e21664a48714c23b9a869c15665eac456e6b1854bb"
Apr 16 22:38:23.973765 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.973741 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-model-cache\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:38:23.973765 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.973768 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-kserve-provision-location\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:38:23.973926 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.973784 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-dshm\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:38:23.973926 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.973798 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-tls-certs\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:38:23.973926 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.973810 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xfrvw\" (UniqueName: \"kubernetes.io/projected/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-kube-api-access-xfrvw\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:38:23.973926 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:23.973822 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a9047af0-fa0c-4323-80c4-7e1bdd8cc483-home\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:38:24.020324 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:24.020301 2572 scope.go:117] "RemoveContainer" containerID="e705ea64af3f7098bffb6cdf0bb58793eaa2e326947f414d77b1c8f242d2a644"
Apr 16 22:38:24.020679 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:38:24.020658 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e705ea64af3f7098bffb6cdf0bb58793eaa2e326947f414d77b1c8f242d2a644\": container with ID starting with e705ea64af3f7098bffb6cdf0bb58793eaa2e326947f414d77b1c8f242d2a644 not found: ID does not exist" containerID="e705ea64af3f7098bffb6cdf0bb58793eaa2e326947f414d77b1c8f242d2a644"
Apr 16 22:38:24.020746 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:24.020687 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e705ea64af3f7098bffb6cdf0bb58793eaa2e326947f414d77b1c8f242d2a644"} err="failed to get container status \"e705ea64af3f7098bffb6cdf0bb58793eaa2e326947f414d77b1c8f242d2a644\": rpc error: code = NotFound desc = could not find container \"e705ea64af3f7098bffb6cdf0bb58793eaa2e326947f414d77b1c8f242d2a644\": container with ID starting with e705ea64af3f7098bffb6cdf0bb58793eaa2e326947f414d77b1c8f242d2a644 not found: ID does not exist"
Apr 16 22:38:24.020746 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:24.020706 2572 scope.go:117] "RemoveContainer" containerID="65070b0f7ae81a2634dee3e21664a48714c23b9a869c15665eac456e6b1854bb"
Apr 16 22:38:24.021010 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:38:24.020982 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65070b0f7ae81a2634dee3e21664a48714c23b9a869c15665eac456e6b1854bb\": container with ID starting with 65070b0f7ae81a2634dee3e21664a48714c23b9a869c15665eac456e6b1854bb not found: ID does not exist" containerID="65070b0f7ae81a2634dee3e21664a48714c23b9a869c15665eac456e6b1854bb"
Apr 16 22:38:24.021348 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:24.021018 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65070b0f7ae81a2634dee3e21664a48714c23b9a869c15665eac456e6b1854bb"} err="failed to get container status \"65070b0f7ae81a2634dee3e21664a48714c23b9a869c15665eac456e6b1854bb\": rpc error: code = NotFound desc = could not find container \"65070b0f7ae81a2634dee3e21664a48714c23b9a869c15665eac456e6b1854bb\": container with ID starting with 65070b0f7ae81a2634dee3e21664a48714c23b9a869c15665eac456e6b1854bb not found: ID does not exist"
Apr 16 22:38:24.125304 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:24.125264 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9"]
Apr 16 22:38:24.128116 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:24.128077 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-7dzrrt9"]
Apr 16 22:38:24.445318 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:24.445286 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" path="/var/lib/kubelet/pods/1e377d24-a06e-4cfa-b31f-f7cb1731b151/volumes"
Apr 16 22:38:24.445827 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:24.445793 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" path="/var/lib/kubelet/pods/a9047af0-fa0c-4323-80c4-7e1bdd8cc483/volumes"
Apr 16 22:38:27.710305 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:27.710254 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 22:38:29.159622 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:29.159577 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 22:38:37.709436 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:37.709396 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 22:38:39.159208 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:39.159157 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 22:38:40.450122 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:40.450078 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/0.log"
Apr 16 22:38:40.451824 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:40.451800 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/0.log"
Apr 16 22:38:47.709630 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:47.709562 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 22:38:49.158679 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:49.158586 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 22:38:57.718906 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:57.718875 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:38:57.726639 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:57.726606 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:38:59.159439 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:38:59.159365 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 22:39:06.379191 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:06.379150 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 22:39:06.379618 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:06.379533 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" containerName="main" containerID="cri-o://d4b6b0b0fad0dc99000bf15868ee1ee02e9cf7ca485749bb93b1b1df578b8cda" gracePeriod=30
Apr 16 22:39:07.516991 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.516967 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:39:07.588213 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.588091 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-kserve-provision-location\") pod \"792e8531-e78c-479f-9b40-685b5a393a1a\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") "
Apr 16 22:39:07.588388 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.588212 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-home\") pod \"792e8531-e78c-479f-9b40-685b5a393a1a\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") "
Apr 16 22:39:07.588388 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.588242 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/792e8531-e78c-479f-9b40-685b5a393a1a-tls-certs\") pod \"792e8531-e78c-479f-9b40-685b5a393a1a\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") "
Apr 16 22:39:07.588388 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.588273 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-model-cache\") pod \"792e8531-e78c-479f-9b40-685b5a393a1a\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") "
Apr 16 22:39:07.588388 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.588298 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-dshm\") pod \"792e8531-e78c-479f-9b40-685b5a393a1a\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") "
Apr 16 22:39:07.588388 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.588357 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lldbp\" (UniqueName: \"kubernetes.io/projected/792e8531-e78c-479f-9b40-685b5a393a1a-kube-api-access-lldbp\") pod \"792e8531-e78c-479f-9b40-685b5a393a1a\" (UID: \"792e8531-e78c-479f-9b40-685b5a393a1a\") "
Apr 16 22:39:07.588664 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.588488 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-model-cache" (OuterVolumeSpecName: "model-cache") pod "792e8531-e78c-479f-9b40-685b5a393a1a" (UID: "792e8531-e78c-479f-9b40-685b5a393a1a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:39:07.588722 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.588689 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-home" (OuterVolumeSpecName: "home") pod "792e8531-e78c-479f-9b40-685b5a393a1a" (UID: "792e8531-e78c-479f-9b40-685b5a393a1a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:39:07.588722 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.588715 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-model-cache\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:39:07.590522 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.590496 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-dshm" (OuterVolumeSpecName: "dshm") pod "792e8531-e78c-479f-9b40-685b5a393a1a" (UID: "792e8531-e78c-479f-9b40-685b5a393a1a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:39:07.590627 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.590524 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/792e8531-e78c-479f-9b40-685b5a393a1a-kube-api-access-lldbp" (OuterVolumeSpecName: "kube-api-access-lldbp") pod "792e8531-e78c-479f-9b40-685b5a393a1a" (UID: "792e8531-e78c-479f-9b40-685b5a393a1a"). InnerVolumeSpecName "kube-api-access-lldbp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:39:07.590697 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.590680 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/792e8531-e78c-479f-9b40-685b5a393a1a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "792e8531-e78c-479f-9b40-685b5a393a1a" (UID: "792e8531-e78c-479f-9b40-685b5a393a1a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:39:07.643607 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.643571 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "792e8531-e78c-479f-9b40-685b5a393a1a" (UID: "792e8531-e78c-479f-9b40-685b5a393a1a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:39:07.689154 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.689121 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-kserve-provision-location\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:39:07.689154 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.689153 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-home\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:39:07.689329 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.689163 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/792e8531-e78c-479f-9b40-685b5a393a1a-tls-certs\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:39:07.689329 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.689172 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/792e8531-e78c-479f-9b40-685b5a393a1a-dshm\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:39:07.689329 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.689180 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lldbp\" (UniqueName: \"kubernetes.io/projected/792e8531-e78c-479f-9b40-685b5a393a1a-kube-api-access-lldbp\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\""
Apr 16 22:39:07.990067 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.989975 2572 generic.go:358] "Generic (PLEG): container finished" podID="792e8531-e78c-479f-9b40-685b5a393a1a" containerID="d4b6b0b0fad0dc99000bf15868ee1ee02e9cf7ca485749bb93b1b1df578b8cda" exitCode=0
Apr 16 22:39:07.990067 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.990021 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"792e8531-e78c-479f-9b40-685b5a393a1a","Type":"ContainerDied","Data":"d4b6b0b0fad0dc99000bf15868ee1ee02e9cf7ca485749bb93b1b1df578b8cda"}
Apr 16 22:39:07.990067 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.990054 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"792e8531-e78c-479f-9b40-685b5a393a1a","Type":"ContainerDied","Data":"b557e688f1be71cd7941372d04420683bac53820e4eaf27cb8f9fcaa7cb376c4"}
Apr 16 22:39:07.990067 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.990053 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 22:39:07.990434 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:07.990067 2572 scope.go:117] "RemoveContainer" containerID="d4b6b0b0fad0dc99000bf15868ee1ee02e9cf7ca485749bb93b1b1df578b8cda"
Apr 16 22:39:08.011813 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:08.011777 2572 scope.go:117] "RemoveContainer" containerID="10e31f0a16516ab2620265f622f5ecc6c0e39a09c10c4041cafc3fb3a70d6d9c"
Apr 16 22:39:08.012461 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:08.012434 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 22:39:08.016209 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:08.016189 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 22:39:08.076607 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:08.076588 2572 scope.go:117] "RemoveContainer" containerID="d4b6b0b0fad0dc99000bf15868ee1ee02e9cf7ca485749bb93b1b1df578b8cda"
Apr 16 22:39:08.076959 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:39:08.076939 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b6b0b0fad0dc99000bf15868ee1ee02e9cf7ca485749bb93b1b1df578b8cda\": container with ID starting with d4b6b0b0fad0dc99000bf15868ee1ee02e9cf7ca485749bb93b1b1df578b8cda not found: ID does not exist" containerID="d4b6b0b0fad0dc99000bf15868ee1ee02e9cf7ca485749bb93b1b1df578b8cda"
Apr 16 22:39:08.077016 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:08.076972 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b6b0b0fad0dc99000bf15868ee1ee02e9cf7ca485749bb93b1b1df578b8cda"} err="failed to get container status \"d4b6b0b0fad0dc99000bf15868ee1ee02e9cf7ca485749bb93b1b1df578b8cda\": rpc error: code = NotFound desc = could not find container \"d4b6b0b0fad0dc99000bf15868ee1ee02e9cf7ca485749bb93b1b1df578b8cda\": container with ID starting with d4b6b0b0fad0dc99000bf15868ee1ee02e9cf7ca485749bb93b1b1df578b8cda not found: ID does not exist"
Apr 16 22:39:08.077016 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:08.076994 2572 scope.go:117] "RemoveContainer" containerID="10e31f0a16516ab2620265f622f5ecc6c0e39a09c10c4041cafc3fb3a70d6d9c"
Apr 16 22:39:08.077330 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:39:08.077315 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e31f0a16516ab2620265f622f5ecc6c0e39a09c10c4041cafc3fb3a70d6d9c\": container with ID starting with 10e31f0a16516ab2620265f622f5ecc6c0e39a09c10c4041cafc3fb3a70d6d9c not found: ID does not exist" containerID="10e31f0a16516ab2620265f622f5ecc6c0e39a09c10c4041cafc3fb3a70d6d9c"
Apr 16 22:39:08.077390 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:08.077334 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e31f0a16516ab2620265f622f5ecc6c0e39a09c10c4041cafc3fb3a70d6d9c"} err="failed to get container status \"10e31f0a16516ab2620265f622f5ecc6c0e39a09c10c4041cafc3fb3a70d6d9c\": rpc error: code = NotFound desc = could not find container \"10e31f0a16516ab2620265f622f5ecc6c0e39a09c10c4041cafc3fb3a70d6d9c\": container with ID starting with 10e31f0a16516ab2620265f622f5ecc6c0e39a09c10c4041cafc3fb3a70d6d9c not found: ID does not exist"
Apr 16 22:39:08.452009 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:08.451971 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" path="/var/lib/kubelet/pods/792e8531-e78c-479f-9b40-685b5a393a1a/volumes"
Apr 16 22:39:09.158744 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:09.158709 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 22:39:19.159416 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:19.159373 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 22:39:23.488776 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.488739 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b"]
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489298 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="llm-d-routing-sidecar"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489318 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="llm-d-routing-sidecar"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489340 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="storage-initializer"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489349 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="storage-initializer"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489364 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="storage-initializer"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489374 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="storage-initializer"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489387 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489396 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489415 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" containerName="main"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489422 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" containerName="main"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489440 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489448 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489466 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" containerName="storage-initializer"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489475 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" containerName="storage-initializer"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489562 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="main"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489574 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="792e8531-e78c-479f-9b40-685b5a393a1a" containerName="main"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489595 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e377d24-a06e-4cfa-b31f-f7cb1731b151" containerName="llm-d-routing-sidecar"
Apr 16 22:39:23.491351 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.489607 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9047af0-fa0c-4323-80c4-7e1bdd8cc483" containerName="main"
Apr 16 22:39:23.496787 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.496260 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.499400 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.499376 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 16 22:39:23.504368 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.504343 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b"] Apr 16 22:39:23.632847 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.632810 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7br7b\" (UniqueName: \"kubernetes.io/projected/759d56de-2ae0-4564-9869-73f2f89c0f75-kube-api-access-7br7b\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.633010 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.632865 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-home\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.633010 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.632934 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.633010 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.632976 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-model-cache\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.633010 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.632999 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-dshm\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.633182 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.633030 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/759d56de-2ae0-4564-9869-73f2f89c0f75-tls-certs\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.734225 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.734189 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/759d56de-2ae0-4564-9869-73f2f89c0f75-tls-certs\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.734355 ip-10-0-133-16 kubenswrapper[2572]: I0416 
22:39:23.734251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7br7b\" (UniqueName: \"kubernetes.io/projected/759d56de-2ae0-4564-9869-73f2f89c0f75-kube-api-access-7br7b\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.734355 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.734285 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-home\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.734355 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.734303 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.734355 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.734323 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-model-cache\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.734355 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.734351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-dshm\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.734776 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.734749 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.734898 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.734796 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-model-cache\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.734898 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.734839 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-home\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.736559 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.736537 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-dshm\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.736720 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.736701 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/759d56de-2ae0-4564-9869-73f2f89c0f75-tls-certs\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.742531 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.742482 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7br7b\" (UniqueName: \"kubernetes.io/projected/759d56de-2ae0-4564-9869-73f2f89c0f75-kube-api-access-7br7b\") pod \"scheduler-inline-config-test-kserve-8dc7cd575-t2q7b\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.810590 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.810555 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:23.939663 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:23.939636 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b"] Apr 16 22:39:23.941124 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:39:23.941077 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod759d56de_2ae0_4564_9869_73f2f89c0f75.slice/crio-c5ed72676cbe9309402f66886b490359ca098eef622494838fa1cc74dc341d95 WatchSource:0}: Error finding container c5ed72676cbe9309402f66886b490359ca098eef622494838fa1cc74dc341d95: Status 404 returned error can't find the container with id c5ed72676cbe9309402f66886b490359ca098eef622494838fa1cc74dc341d95 Apr 16 22:39:24.049092 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:24.049053 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" event={"ID":"759d56de-2ae0-4564-9869-73f2f89c0f75","Type":"ContainerStarted","Data":"3d67f2c7b40dfcf8e97e2aff237b2821116fb103cb665ca60403fc1f9885b4f9"} Apr 16 22:39:24.049301 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:24.049096 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" event={"ID":"759d56de-2ae0-4564-9869-73f2f89c0f75","Type":"ContainerStarted","Data":"c5ed72676cbe9309402f66886b490359ca098eef622494838fa1cc74dc341d95"} Apr 16 22:39:29.070061 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:29.070026 2572 generic.go:358] "Generic (PLEG): container finished" podID="759d56de-2ae0-4564-9869-73f2f89c0f75" containerID="3d67f2c7b40dfcf8e97e2aff237b2821116fb103cb665ca60403fc1f9885b4f9" exitCode=0 Apr 16 22:39:29.070447 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:29.070119 2572 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" event={"ID":"759d56de-2ae0-4564-9869-73f2f89c0f75","Type":"ContainerDied","Data":"3d67f2c7b40dfcf8e97e2aff237b2821116fb103cb665ca60403fc1f9885b4f9"} Apr 16 22:39:29.159481 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:29.159436 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 16 22:39:30.075434 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:30.075397 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" event={"ID":"759d56de-2ae0-4564-9869-73f2f89c0f75","Type":"ContainerStarted","Data":"b31598b0af367602119e07569e36b75c7ae05b06a0382b12822359508792ee27"} Apr 16 22:39:30.095531 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:30.095480 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" podStartSLOduration=7.095461806 podStartE2EDuration="7.095461806s" podCreationTimestamp="2026-04-16 22:39:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:39:30.093624068 +0000 UTC m=+1550.318790464" watchObservedRunningTime="2026-04-16 22:39:30.095461806 +0000 UTC m=+1550.320628187" Apr 16 22:39:33.811395 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:33.811346 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:33.811818 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:33.811491 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:33.824239 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:33.824214 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:34.100922 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:34.100837 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:39.159151 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:39.159081 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 16 22:39:49.168922 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:49.168882 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:39:49.176709 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:49.176684 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:39:56.954727 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:56.954687 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b"] Apr 16 22:39:56.955627 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:56.955571 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" podUID="759d56de-2ae0-4564-9869-73f2f89c0f75" containerName="main" 
containerID="cri-o://b31598b0af367602119e07569e36b75c7ae05b06a0382b12822359508792ee27" gracePeriod=30 Apr 16 22:39:57.170916 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.170884 2572 generic.go:358] "Generic (PLEG): container finished" podID="759d56de-2ae0-4564-9869-73f2f89c0f75" containerID="b31598b0af367602119e07569e36b75c7ae05b06a0382b12822359508792ee27" exitCode=0 Apr 16 22:39:57.171101 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.170964 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" event={"ID":"759d56de-2ae0-4564-9869-73f2f89c0f75","Type":"ContainerDied","Data":"b31598b0af367602119e07569e36b75c7ae05b06a0382b12822359508792ee27"} Apr 16 22:39:57.211239 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.211178 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:57.354966 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.354934 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-dshm\") pod \"759d56de-2ae0-4564-9869-73f2f89c0f75\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " Apr 16 22:39:57.354966 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.354971 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-model-cache\") pod \"759d56de-2ae0-4564-9869-73f2f89c0f75\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " Apr 16 22:39:57.355203 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.354996 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7br7b\" (UniqueName: \"kubernetes.io/projected/759d56de-2ae0-4564-9869-73f2f89c0f75-kube-api-access-7br7b\") pod 
\"759d56de-2ae0-4564-9869-73f2f89c0f75\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " Apr 16 22:39:57.355203 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.355021 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/759d56de-2ae0-4564-9869-73f2f89c0f75-tls-certs\") pod \"759d56de-2ae0-4564-9869-73f2f89c0f75\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " Apr 16 22:39:57.355203 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.355185 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-kserve-provision-location\") pod \"759d56de-2ae0-4564-9869-73f2f89c0f75\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " Apr 16 22:39:57.355354 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.355223 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-home\") pod \"759d56de-2ae0-4564-9869-73f2f89c0f75\" (UID: \"759d56de-2ae0-4564-9869-73f2f89c0f75\") " Apr 16 22:39:57.355354 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.355273 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-model-cache" (OuterVolumeSpecName: "model-cache") pod "759d56de-2ae0-4564-9869-73f2f89c0f75" (UID: "759d56de-2ae0-4564-9869-73f2f89c0f75"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:39:57.355492 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.355467 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-home" (OuterVolumeSpecName: "home") pod "759d56de-2ae0-4564-9869-73f2f89c0f75" (UID: "759d56de-2ae0-4564-9869-73f2f89c0f75"). 
InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:39:57.355612 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.355561 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-home\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:39:57.355612 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.355578 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-model-cache\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:39:57.357196 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.357174 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/759d56de-2ae0-4564-9869-73f2f89c0f75-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "759d56de-2ae0-4564-9869-73f2f89c0f75" (UID: "759d56de-2ae0-4564-9869-73f2f89c0f75"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:39:57.357520 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.357493 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/759d56de-2ae0-4564-9869-73f2f89c0f75-kube-api-access-7br7b" (OuterVolumeSpecName: "kube-api-access-7br7b") pod "759d56de-2ae0-4564-9869-73f2f89c0f75" (UID: "759d56de-2ae0-4564-9869-73f2f89c0f75"). InnerVolumeSpecName "kube-api-access-7br7b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:39:57.357625 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.357558 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-dshm" (OuterVolumeSpecName: "dshm") pod "759d56de-2ae0-4564-9869-73f2f89c0f75" (UID: "759d56de-2ae0-4564-9869-73f2f89c0f75"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:39:57.421625 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.421570 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "759d56de-2ae0-4564-9869-73f2f89c0f75" (UID: "759d56de-2ae0-4564-9869-73f2f89c0f75"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:39:57.456940 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.456903 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-kserve-provision-location\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:39:57.456940 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.456930 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/759d56de-2ae0-4564-9869-73f2f89c0f75-dshm\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:39:57.456940 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.456944 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7br7b\" (UniqueName: \"kubernetes.io/projected/759d56de-2ae0-4564-9869-73f2f89c0f75-kube-api-access-7br7b\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:39:57.457162 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:57.456955 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/759d56de-2ae0-4564-9869-73f2f89c0f75-tls-certs\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:39:58.175868 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:58.175829 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" event={"ID":"759d56de-2ae0-4564-9869-73f2f89c0f75","Type":"ContainerDied","Data":"c5ed72676cbe9309402f66886b490359ca098eef622494838fa1cc74dc341d95"} Apr 16 22:39:58.175868 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:58.175866 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b" Apr 16 22:39:58.176327 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:58.175887 2572 scope.go:117] "RemoveContainer" containerID="b31598b0af367602119e07569e36b75c7ae05b06a0382b12822359508792ee27" Apr 16 22:39:58.184777 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:58.184760 2572 scope.go:117] "RemoveContainer" containerID="3d67f2c7b40dfcf8e97e2aff237b2821116fb103cb665ca60403fc1f9885b4f9" Apr 16 22:39:58.197094 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:58.197072 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b"] Apr 16 22:39:58.198604 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:58.198584 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-8dc7cd575-t2q7b"] Apr 16 22:39:58.444488 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:39:58.444410 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="759d56de-2ae0-4564-9869-73f2f89c0f75" path="/var/lib/kubelet/pods/759d56de-2ae0-4564-9869-73f2f89c0f75/volumes" Apr 16 22:40:11.003592 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.003556 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk"] Apr 16 22:40:11.004145 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.004098 2572 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" containerName="main" containerID="cri-o://d8c2287d88f4fc9afed126e1cfb080c184b71e26da56dd0057fb8a7dd7664873" gracePeriod=30 Apr 16 22:40:11.048338 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.048301 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq"] Apr 16 22:40:11.048698 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.048684 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="759d56de-2ae0-4564-9869-73f2f89c0f75" containerName="storage-initializer" Apr 16 22:40:11.048698 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.048699 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="759d56de-2ae0-4564-9869-73f2f89c0f75" containerName="storage-initializer" Apr 16 22:40:11.048873 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.048707 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="759d56de-2ae0-4564-9869-73f2f89c0f75" containerName="main" Apr 16 22:40:11.048873 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.048713 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="759d56de-2ae0-4564-9869-73f2f89c0f75" containerName="main" Apr 16 22:40:11.048873 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.048779 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="759d56de-2ae0-4564-9869-73f2f89c0f75" containerName="main" Apr 16 22:40:11.053880 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.053854 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.056817 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.056794 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-j9rrb\"" Apr 16 22:40:11.056932 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.056888 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 16 22:40:11.065239 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.065214 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq"] Apr 16 22:40:11.169195 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.169160 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw7bl\" (UniqueName: \"kubernetes.io/projected/f350b007-8b12-4561-aa7c-9c4fe292da30-kube-api-access-lw7bl\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.169399 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.169201 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f350b007-8b12-4561-aa7c-9c4fe292da30-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.169399 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.169285 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/f350b007-8b12-4561-aa7c-9c4fe292da30-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.169399 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.169321 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f350b007-8b12-4561-aa7c-9c4fe292da30-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.169399 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.169352 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f350b007-8b12-4561-aa7c-9c4fe292da30-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.169399 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.169380 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f350b007-8b12-4561-aa7c-9c4fe292da30-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.169571 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.169425 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f350b007-8b12-4561-aa7c-9c4fe292da30-istio-envoy\") 
pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.169571 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.169445 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f350b007-8b12-4561-aa7c-9c4fe292da30-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.169571 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.169473 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f350b007-8b12-4561-aa7c-9c4fe292da30-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.270691 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.270612 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f350b007-8b12-4561-aa7c-9c4fe292da30-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.270691 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.270655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f350b007-8b12-4561-aa7c-9c4fe292da30-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.270895 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.270693 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f350b007-8b12-4561-aa7c-9c4fe292da30-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.270895 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.270709 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f350b007-8b12-4561-aa7c-9c4fe292da30-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.270895 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.270729 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f350b007-8b12-4561-aa7c-9c4fe292da30-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.270895 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.270751 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw7bl\" (UniqueName: \"kubernetes.io/projected/f350b007-8b12-4561-aa7c-9c4fe292da30-kube-api-access-lw7bl\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.270895 ip-10-0-133-16 kubenswrapper[2572]: I0416 
22:40:11.270781 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f350b007-8b12-4561-aa7c-9c4fe292da30-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.270895 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.270833 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f350b007-8b12-4561-aa7c-9c4fe292da30-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.270895 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.270868 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f350b007-8b12-4561-aa7c-9c4fe292da30-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.271258 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.271209 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f350b007-8b12-4561-aa7c-9c4fe292da30-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.271258 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.271234 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/f350b007-8b12-4561-aa7c-9c4fe292da30-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.271379 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.271290 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f350b007-8b12-4561-aa7c-9c4fe292da30-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.271379 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.271304 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f350b007-8b12-4561-aa7c-9c4fe292da30-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.271906 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.271886 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f350b007-8b12-4561-aa7c-9c4fe292da30-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.273080 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.273060 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f350b007-8b12-4561-aa7c-9c4fe292da30-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: 
\"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.273260 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.273242 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f350b007-8b12-4561-aa7c-9c4fe292da30-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.278558 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.278538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw7bl\" (UniqueName: \"kubernetes.io/projected/f350b007-8b12-4561-aa7c-9c4fe292da30-kube-api-access-lw7bl\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.278899 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.278881 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f350b007-8b12-4561-aa7c-9c4fe292da30-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-6tjhq\" (UID: \"f350b007-8b12-4561-aa7c-9c4fe292da30\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.366200 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.366172 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:11.494724 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:11.494694 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq"] Apr 16 22:40:11.496582 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:40:11.496553 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf350b007_8b12_4561_aa7c_9c4fe292da30.slice/crio-8b55ac75799014db64a333ebd23c5de3c6411feebf2a290c70978738c1e4c812 WatchSource:0}: Error finding container 8b55ac75799014db64a333ebd23c5de3c6411feebf2a290c70978738c1e4c812: Status 404 returned error can't find the container with id 8b55ac75799014db64a333ebd23c5de3c6411feebf2a290c70978738c1e4c812 Apr 16 22:40:12.228641 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:12.228600 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" event={"ID":"f350b007-8b12-4561-aa7c-9c4fe292da30","Type":"ContainerStarted","Data":"8b55ac75799014db64a333ebd23c5de3c6411feebf2a290c70978738c1e4c812"} Apr 16 22:40:13.716530 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:13.716487 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 22:40:13.716774 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:13.716592 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 22:40:13.716774 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:13.716630 2572 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 22:40:14.236903 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:14.236863 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" event={"ID":"f350b007-8b12-4561-aa7c-9c4fe292da30","Type":"ContainerStarted","Data":"2aab879b355677dbb32fc1699e1faf4269df97aba7fc9ae7645b50d951f4f93a"} Apr 16 22:40:14.259465 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:14.259419 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" podStartSLOduration=1.04168916 podStartE2EDuration="3.259406892s" podCreationTimestamp="2026-04-16 22:40:11 +0000 UTC" firstStartedPulling="2026-04-16 22:40:11.49853181 +0000 UTC m=+1591.723698174" lastFinishedPulling="2026-04-16 22:40:13.716249539 +0000 UTC m=+1593.941415906" observedRunningTime="2026-04-16 22:40:14.2575981 +0000 UTC m=+1594.482764505" watchObservedRunningTime="2026-04-16 22:40:14.259406892 +0000 UTC m=+1594.484573272" Apr 16 22:40:14.366726 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:14.366694 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:14.368286 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:14.368253 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" podUID="f350b007-8b12-4561-aa7c-9c4fe292da30" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.45:15021/healthz/ready\": dial tcp 10.134.0.45:15021: connect: connection refused" Apr 16 22:40:15.366678 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:15.366632 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" podUID="f350b007-8b12-4561-aa7c-9c4fe292da30" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.45:15021/healthz/ready\": dial tcp 10.134.0.45:15021: connect: connection refused" Apr 16 22:40:16.367466 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:16.367421 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" podUID="f350b007-8b12-4561-aa7c-9c4fe292da30" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.45:15021/healthz/ready\": dial tcp 10.134.0.45:15021: connect: connection refused" Apr 16 22:40:17.370420 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:17.370391 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:17.370814 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:17.370676 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:17.371331 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:17.371311 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6tjhq" Apr 16 22:40:24.812444 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.812407 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv"] Apr 16 22:40:24.816323 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.816307 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.818905 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.818884 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 22:40:24.819071 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.818980 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-wvv5k\"" Apr 16 22:40:24.826148 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.826124 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv"] Apr 16 22:40:24.893081 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.893053 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-home\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.893234 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.893126 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-dshm\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.893234 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.893204 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6428449d-012b-44d2-bd22-55a2c3f144a6-tls-certs\") pod 
\"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.893313 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.893232 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkn6w\" (UniqueName: \"kubernetes.io/projected/6428449d-012b-44d2-bd22-55a2c3f144a6-kube-api-access-nkn6w\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.893313 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.893254 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.893313 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.893276 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-model-cache\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.994187 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.994154 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-dshm\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.994338 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.994217 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6428449d-012b-44d2-bd22-55a2c3f144a6-tls-certs\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.994338 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.994256 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkn6w\" (UniqueName: \"kubernetes.io/projected/6428449d-012b-44d2-bd22-55a2c3f144a6-kube-api-access-nkn6w\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.994338 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.994284 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.994338 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.994316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-model-cache\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.994544 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.994345 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-home\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.994709 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.994686 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.994822 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.994725 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-model-cache\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.994822 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.994772 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-home\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.996379 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.996357 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-dshm\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" 
(UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:24.996635 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:24.996617 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6428449d-012b-44d2-bd22-55a2c3f144a6-tls-certs\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:25.001603 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:25.001572 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkn6w\" (UniqueName: \"kubernetes.io/projected/6428449d-012b-44d2-bd22-55a2c3f144a6-kube-api-access-nkn6w\") pod \"router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:25.126391 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:25.126319 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:25.253558 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:25.253526 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv"] Apr 16 22:40:25.255420 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:40:25.255387 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6428449d_012b_44d2_bd22_55a2c3f144a6.slice/crio-d16d2d40e31dbaa1a73e66cf6706860246b19351942331c0f652efde68b87faf WatchSource:0}: Error finding container d16d2d40e31dbaa1a73e66cf6706860246b19351942331c0f652efde68b87faf: Status 404 returned error can't find the container with id d16d2d40e31dbaa1a73e66cf6706860246b19351942331c0f652efde68b87faf Apr 16 22:40:25.289183 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:25.289145 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" event={"ID":"6428449d-012b-44d2-bd22-55a2c3f144a6","Type":"ContainerStarted","Data":"d16d2d40e31dbaa1a73e66cf6706860246b19351942331c0f652efde68b87faf"} Apr 16 22:40:26.293865 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:26.293827 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" event={"ID":"6428449d-012b-44d2-bd22-55a2c3f144a6","Type":"ContainerStarted","Data":"e350e6e7be43be9b4c727a7c03c277e124d41a829bcfa03353fff77788fa767f"} Apr 16 22:40:26.294266 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:26.293990 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:27.300305 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:27.300270 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" event={"ID":"6428449d-012b-44d2-bd22-55a2c3f144a6","Type":"ContainerStarted","Data":"a8906bccf45f02a0ac8d0eb4d0d8789b27d6492c847b37a14cb3d26eba4d4436"} Apr 16 22:40:31.315460 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:31.315425 2572 generic.go:358] "Generic (PLEG): container finished" podID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerID="a8906bccf45f02a0ac8d0eb4d0d8789b27d6492c847b37a14cb3d26eba4d4436" exitCode=0 Apr 16 22:40:31.315825 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:31.315488 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" event={"ID":"6428449d-012b-44d2-bd22-55a2c3f144a6","Type":"ContainerDied","Data":"a8906bccf45f02a0ac8d0eb4d0d8789b27d6492c847b37a14cb3d26eba4d4436"} Apr 16 22:40:32.322026 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:32.321982 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" event={"ID":"6428449d-012b-44d2-bd22-55a2c3f144a6","Type":"ContainerStarted","Data":"ca8c89f8283e426e4e4ec551b0172a8a81f1657bf515966c5a47c4f3084b3ceb"} Apr 16 22:40:32.345756 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:32.345697 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" podStartSLOduration=8.345682317 podStartE2EDuration="8.345682317s" podCreationTimestamp="2026-04-16 22:40:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:40:32.342948602 +0000 UTC m=+1612.568114983" watchObservedRunningTime="2026-04-16 22:40:32.345682317 +0000 UTC m=+1612.570848697" Apr 16 22:40:35.127289 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:35.127241 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:35.127289 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:35.127296 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:35.128560 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:35.128521 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 16 22:40:35.147335 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:35.147310 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:40:41.295025 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.294999 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:40:41.361584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.361550 2572 generic.go:358] "Generic (PLEG): container finished" podID="21c668be-ecde-41df-b6aa-473fbafeed94" containerID="d8c2287d88f4fc9afed126e1cfb080c184b71e26da56dd0057fb8a7dd7664873" exitCode=137 Apr 16 22:40:41.361761 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.361621 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" Apr 16 22:40:41.361761 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.361626 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" event={"ID":"21c668be-ecde-41df-b6aa-473fbafeed94","Type":"ContainerDied","Data":"d8c2287d88f4fc9afed126e1cfb080c184b71e26da56dd0057fb8a7dd7664873"} Apr 16 22:40:41.361761 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.361667 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk" event={"ID":"21c668be-ecde-41df-b6aa-473fbafeed94","Type":"ContainerDied","Data":"76c0bac5e8de8b618ba354f9c9eaffe78e55aab402d23dba2c65cde489b45fd5"} Apr 16 22:40:41.361761 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.361683 2572 scope.go:117] "RemoveContainer" containerID="d8c2287d88f4fc9afed126e1cfb080c184b71e26da56dd0057fb8a7dd7664873" Apr 16 22:40:41.380915 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.380894 2572 scope.go:117] "RemoveContainer" containerID="eca0b7ddbace4b2da93fab2a4e945c9df736c51160ec063470630ebc14c83b21" Apr 16 22:40:41.447648 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.447611 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21c668be-ecde-41df-b6aa-473fbafeed94-tls-certs\") pod \"21c668be-ecde-41df-b6aa-473fbafeed94\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " Apr 16 22:40:41.447814 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.447658 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-home\") pod \"21c668be-ecde-41df-b6aa-473fbafeed94\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " Apr 16 22:40:41.447814 ip-10-0-133-16 
kubenswrapper[2572]: I0416 22:40:41.447708 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-dshm\") pod \"21c668be-ecde-41df-b6aa-473fbafeed94\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " Apr 16 22:40:41.447814 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.447772 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpxng\" (UniqueName: \"kubernetes.io/projected/21c668be-ecde-41df-b6aa-473fbafeed94-kube-api-access-vpxng\") pod \"21c668be-ecde-41df-b6aa-473fbafeed94\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " Apr 16 22:40:41.447814 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.447805 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-kserve-provision-location\") pod \"21c668be-ecde-41df-b6aa-473fbafeed94\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " Apr 16 22:40:41.448055 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.447876 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-model-cache\") pod \"21c668be-ecde-41df-b6aa-473fbafeed94\" (UID: \"21c668be-ecde-41df-b6aa-473fbafeed94\") " Apr 16 22:40:41.448384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.448075 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-home" (OuterVolumeSpecName: "home") pod "21c668be-ecde-41df-b6aa-473fbafeed94" (UID: "21c668be-ecde-41df-b6aa-473fbafeed94"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:40:41.448384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.448231 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-home\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:40:41.448384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.448326 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-model-cache" (OuterVolumeSpecName: "model-cache") pod "21c668be-ecde-41df-b6aa-473fbafeed94" (UID: "21c668be-ecde-41df-b6aa-473fbafeed94"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:40:41.448656 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.448585 2572 scope.go:117] "RemoveContainer" containerID="d8c2287d88f4fc9afed126e1cfb080c184b71e26da56dd0057fb8a7dd7664873" Apr 16 22:40:41.449175 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:40:41.449006 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8c2287d88f4fc9afed126e1cfb080c184b71e26da56dd0057fb8a7dd7664873\": container with ID starting with d8c2287d88f4fc9afed126e1cfb080c184b71e26da56dd0057fb8a7dd7664873 not found: ID does not exist" containerID="d8c2287d88f4fc9afed126e1cfb080c184b71e26da56dd0057fb8a7dd7664873" Apr 16 22:40:41.449175 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.449041 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8c2287d88f4fc9afed126e1cfb080c184b71e26da56dd0057fb8a7dd7664873"} err="failed to get container status \"d8c2287d88f4fc9afed126e1cfb080c184b71e26da56dd0057fb8a7dd7664873\": rpc error: code = NotFound desc = could not find container \"d8c2287d88f4fc9afed126e1cfb080c184b71e26da56dd0057fb8a7dd7664873\": container with ID starting with 
d8c2287d88f4fc9afed126e1cfb080c184b71e26da56dd0057fb8a7dd7664873 not found: ID does not exist" Apr 16 22:40:41.449175 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.449068 2572 scope.go:117] "RemoveContainer" containerID="eca0b7ddbace4b2da93fab2a4e945c9df736c51160ec063470630ebc14c83b21" Apr 16 22:40:41.449463 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:40:41.449423 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eca0b7ddbace4b2da93fab2a4e945c9df736c51160ec063470630ebc14c83b21\": container with ID starting with eca0b7ddbace4b2da93fab2a4e945c9df736c51160ec063470630ebc14c83b21 not found: ID does not exist" containerID="eca0b7ddbace4b2da93fab2a4e945c9df736c51160ec063470630ebc14c83b21" Apr 16 22:40:41.449524 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.449458 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca0b7ddbace4b2da93fab2a4e945c9df736c51160ec063470630ebc14c83b21"} err="failed to get container status \"eca0b7ddbace4b2da93fab2a4e945c9df736c51160ec063470630ebc14c83b21\": rpc error: code = NotFound desc = could not find container \"eca0b7ddbace4b2da93fab2a4e945c9df736c51160ec063470630ebc14c83b21\": container with ID starting with eca0b7ddbace4b2da93fab2a4e945c9df736c51160ec063470630ebc14c83b21 not found: ID does not exist" Apr 16 22:40:41.450485 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.450447 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c668be-ecde-41df-b6aa-473fbafeed94-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "21c668be-ecde-41df-b6aa-473fbafeed94" (UID: "21c668be-ecde-41df-b6aa-473fbafeed94"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:40:41.450747 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.450725 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c668be-ecde-41df-b6aa-473fbafeed94-kube-api-access-vpxng" (OuterVolumeSpecName: "kube-api-access-vpxng") pod "21c668be-ecde-41df-b6aa-473fbafeed94" (UID: "21c668be-ecde-41df-b6aa-473fbafeed94"). InnerVolumeSpecName "kube-api-access-vpxng". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:40:41.450850 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.450828 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-dshm" (OuterVolumeSpecName: "dshm") pod "21c668be-ecde-41df-b6aa-473fbafeed94" (UID: "21c668be-ecde-41df-b6aa-473fbafeed94"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:40:41.516226 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.516182 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "21c668be-ecde-41df-b6aa-473fbafeed94" (UID: "21c668be-ecde-41df-b6aa-473fbafeed94"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:40:41.548898 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.548859 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-dshm\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:40:41.548898 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.548894 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vpxng\" (UniqueName: \"kubernetes.io/projected/21c668be-ecde-41df-b6aa-473fbafeed94-kube-api-access-vpxng\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:40:41.549088 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.548911 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-kserve-provision-location\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:40:41.549088 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.548925 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/21c668be-ecde-41df-b6aa-473fbafeed94-model-cache\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:40:41.549088 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.548937 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/21c668be-ecde-41df-b6aa-473fbafeed94-tls-certs\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:40:41.684414 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.684381 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk"] Apr 16 22:40:41.687860 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:41.687832 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5f947c8d55-l4tkk"] Apr 16 22:40:42.447391 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:42.447340 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" path="/var/lib/kubelet/pods/21c668be-ecde-41df-b6aa-473fbafeed94/volumes" Apr 16 22:40:45.127066 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:45.127004 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 16 22:40:55.127146 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:40:55.127083 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 16 22:41:05.127579 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:41:05.127520 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 16 22:41:15.127753 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:41:15.127698 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 16 22:41:25.127168 
ip-10-0-133-16 kubenswrapper[2572]: I0416 22:41:25.127080 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 16 22:41:35.126988 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:41:35.126925 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 16 22:41:45.127195 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:41:45.127142 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 16 22:41:55.126952 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:41:55.126861 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 16 22:42:05.136499 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:05.136464 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:42:05.148291 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:05.148267 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:42:15.621969 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:15.621934 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv"] Apr 16 22:42:15.622494 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:15.622367 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="main" containerID="cri-o://ca8c89f8283e426e4e4ec551b0172a8a81f1657bf515966c5a47c4f3084b3ceb" gracePeriod=30 Apr 16 22:42:16.592167 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.592135 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jq6m8/must-gather-mdcxr"] Apr 16 22:42:16.592507 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.592495 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" containerName="storage-initializer" Apr 16 22:42:16.592552 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.592509 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" containerName="storage-initializer" Apr 16 22:42:16.592552 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.592523 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" containerName="main" Apr 16 22:42:16.592552 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.592528 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" containerName="main" Apr 16 22:42:16.592650 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.592595 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="21c668be-ecde-41df-b6aa-473fbafeed94" containerName="main" Apr 16 22:42:16.595872 ip-10-0-133-16 
kubenswrapper[2572]: I0416 22:42:16.595853 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jq6m8/must-gather-mdcxr" Apr 16 22:42:16.598841 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.598819 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jq6m8\"/\"openshift-service-ca.crt\"" Apr 16 22:42:16.599860 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.599842 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jq6m8\"/\"kube-root-ca.crt\"" Apr 16 22:42:16.599937 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.599842 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jq6m8\"/\"default-dockercfg-dtwfl\"" Apr 16 22:42:16.603379 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.603086 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jq6m8/must-gather-mdcxr"] Apr 16 22:42:16.626584 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.626555 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f77a1b4-e06b-498e-b816-bdfbf22db2b8-must-gather-output\") pod \"must-gather-mdcxr\" (UID: \"6f77a1b4-e06b-498e-b816-bdfbf22db2b8\") " pod="openshift-must-gather-jq6m8/must-gather-mdcxr" Apr 16 22:42:16.626899 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.626644 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29sbl\" (UniqueName: \"kubernetes.io/projected/6f77a1b4-e06b-498e-b816-bdfbf22db2b8-kube-api-access-29sbl\") pod \"must-gather-mdcxr\" (UID: \"6f77a1b4-e06b-498e-b816-bdfbf22db2b8\") " pod="openshift-must-gather-jq6m8/must-gather-mdcxr" Apr 16 22:42:16.727510 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.727474 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f77a1b4-e06b-498e-b816-bdfbf22db2b8-must-gather-output\") pod \"must-gather-mdcxr\" (UID: \"6f77a1b4-e06b-498e-b816-bdfbf22db2b8\") " pod="openshift-must-gather-jq6m8/must-gather-mdcxr" Apr 16 22:42:16.727656 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.727583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29sbl\" (UniqueName: \"kubernetes.io/projected/6f77a1b4-e06b-498e-b816-bdfbf22db2b8-kube-api-access-29sbl\") pod \"must-gather-mdcxr\" (UID: \"6f77a1b4-e06b-498e-b816-bdfbf22db2b8\") " pod="openshift-must-gather-jq6m8/must-gather-mdcxr" Apr 16 22:42:16.727825 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.727809 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f77a1b4-e06b-498e-b816-bdfbf22db2b8-must-gather-output\") pod \"must-gather-mdcxr\" (UID: \"6f77a1b4-e06b-498e-b816-bdfbf22db2b8\") " pod="openshift-must-gather-jq6m8/must-gather-mdcxr" Apr 16 22:42:16.736461 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.736437 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29sbl\" (UniqueName: \"kubernetes.io/projected/6f77a1b4-e06b-498e-b816-bdfbf22db2b8-kube-api-access-29sbl\") pod \"must-gather-mdcxr\" (UID: \"6f77a1b4-e06b-498e-b816-bdfbf22db2b8\") " pod="openshift-must-gather-jq6m8/must-gather-mdcxr" Apr 16 22:42:16.906039 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:16.905957 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jq6m8/must-gather-mdcxr" Apr 16 22:42:17.032608 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:17.031871 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jq6m8/must-gather-mdcxr"] Apr 16 22:42:17.033522 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:42:17.033492 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f77a1b4_e06b_498e_b816_bdfbf22db2b8.slice/crio-dbda88bb63a19fba3ea2f87d11e18308e90090210764be59e311dcedc10b7e6d WatchSource:0}: Error finding container dbda88bb63a19fba3ea2f87d11e18308e90090210764be59e311dcedc10b7e6d: Status 404 returned error can't find the container with id dbda88bb63a19fba3ea2f87d11e18308e90090210764be59e311dcedc10b7e6d Apr 16 22:42:17.708966 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:17.708928 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jq6m8/must-gather-mdcxr" event={"ID":"6f77a1b4-e06b-498e-b816-bdfbf22db2b8","Type":"ContainerStarted","Data":"dbda88bb63a19fba3ea2f87d11e18308e90090210764be59e311dcedc10b7e6d"} Apr 16 22:42:21.817510 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:21.817486 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:42:22.732183 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:22.732140 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jq6m8/must-gather-mdcxr" event={"ID":"6f77a1b4-e06b-498e-b816-bdfbf22db2b8","Type":"ContainerStarted","Data":"d1ea739c287e576cf86589a9aec7dfbbff6047a3570b1cae4f9105b8e86a87bb"} Apr 16 22:42:22.732183 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:22.732188 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jq6m8/must-gather-mdcxr" 
event={"ID":"6f77a1b4-e06b-498e-b816-bdfbf22db2b8","Type":"ContainerStarted","Data":"c5a056ff1952cd474f953c38dfdc96c1f7e285553113b9ee8cfcf2e4ff12ac72"} Apr 16 22:42:22.750663 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:22.750597 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jq6m8/must-gather-mdcxr" podStartSLOduration=2.094603033 podStartE2EDuration="6.750577897s" podCreationTimestamp="2026-04-16 22:42:16 +0000 UTC" firstStartedPulling="2026-04-16 22:42:17.035564261 +0000 UTC m=+1717.260730620" lastFinishedPulling="2026-04-16 22:42:21.691539116 +0000 UTC m=+1721.916705484" observedRunningTime="2026-04-16 22:42:22.748354549 +0000 UTC m=+1722.973520932" watchObservedRunningTime="2026-04-16 22:42:22.750577897 +0000 UTC m=+1722.975744281" Apr 16 22:42:30.923941 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:30.923904 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6tjhq_f350b007-8b12-4561-aa7c-9c4fe292da30/istio-proxy/0.log" Apr 16 22:42:30.957555 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:30.957516 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:30.964151 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:30.964126 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/llm-d-routing-sidecar/0.log" Apr 16 22:42:30.974860 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:30.974834 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/storage-initializer/0.log" Apr 16 22:42:31.964675 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:31.964646 2572 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6tjhq_f350b007-8b12-4561-aa7c-9c4fe292da30/istio-proxy/0.log" Apr 16 22:42:31.985044 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:31.985012 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:31.991415 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:31.991385 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/llm-d-routing-sidecar/0.log" Apr 16 22:42:32.001034 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:32.001011 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/storage-initializer/0.log" Apr 16 22:42:32.980932 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:32.980908 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6tjhq_f350b007-8b12-4561-aa7c-9c4fe292da30/istio-proxy/0.log" Apr 16 22:42:33.002532 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:33.002503 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:33.009413 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:33.009392 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/llm-d-routing-sidecar/0.log" Apr 16 22:42:33.021015 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:33.020993 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/storage-initializer/0.log" Apr 16 22:42:33.979794 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:33.979759 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6tjhq_f350b007-8b12-4561-aa7c-9c4fe292da30/istio-proxy/0.log" Apr 16 22:42:34.002265 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:34.002238 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:34.008773 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:34.008753 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/llm-d-routing-sidecar/0.log" Apr 16 22:42:34.019588 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:34.019564 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/storage-initializer/0.log" Apr 16 22:42:34.971781 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:34.971752 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6tjhq_f350b007-8b12-4561-aa7c-9c4fe292da30/istio-proxy/0.log" Apr 16 22:42:34.993287 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:34.993250 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:34.999094 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:34.999072 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/llm-d-routing-sidecar/0.log" Apr 16 22:42:35.009854 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:35.009836 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/storage-initializer/0.log" Apr 16 22:42:35.937842 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:35.937804 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6tjhq_f350b007-8b12-4561-aa7c-9c4fe292da30/istio-proxy/0.log" Apr 16 22:42:35.957553 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:35.957518 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:35.963461 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:35.963439 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/llm-d-routing-sidecar/0.log" Apr 16 22:42:35.974935 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:35.974910 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/storage-initializer/0.log" Apr 16 22:42:36.919926 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:36.919901 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6tjhq_f350b007-8b12-4561-aa7c-9c4fe292da30/istio-proxy/0.log" Apr 16 22:42:36.940749 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:36.940725 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:36.946523 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:36.946500 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/llm-d-routing-sidecar/0.log" Apr 16 22:42:36.961571 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:36.961545 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/storage-initializer/0.log" Apr 16 22:42:37.933256 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:37.933223 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6tjhq_f350b007-8b12-4561-aa7c-9c4fe292da30/istio-proxy/0.log" Apr 16 22:42:37.954971 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:37.954945 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:37.975907 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:37.975883 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/llm-d-routing-sidecar/0.log" Apr 16 22:42:37.987352 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:37.987329 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/storage-initializer/0.log" Apr 16 22:42:38.967303 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:38.967269 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6tjhq_f350b007-8b12-4561-aa7c-9c4fe292da30/istio-proxy/0.log" Apr 16 22:42:38.989565 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:38.989542 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:38.996395 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:38.996372 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/llm-d-routing-sidecar/0.log" Apr 16 22:42:39.006859 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:39.006839 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/storage-initializer/0.log" Apr 16 22:42:39.968139 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:39.968095 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6tjhq_f350b007-8b12-4561-aa7c-9c4fe292da30/istio-proxy/0.log" Apr 16 22:42:39.995968 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:39.995942 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:40.009468 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:40.009439 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/llm-d-routing-sidecar/0.log" Apr 16 22:42:40.025006 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:40.024977 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/storage-initializer/0.log" Apr 16 22:42:40.997413 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:40.997386 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6tjhq_f350b007-8b12-4561-aa7c-9c4fe292da30/istio-proxy/0.log" Apr 16 22:42:41.022864 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:41.022838 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:41.029687 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:41.029665 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/llm-d-routing-sidecar/0.log" Apr 16 22:42:41.041098 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:41.041072 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/storage-initializer/0.log" Apr 16 22:42:42.024794 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:42.024762 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6tjhq_f350b007-8b12-4561-aa7c-9c4fe292da30/istio-proxy/0.log" Apr 16 22:42:42.045403 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:42.045373 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:42.051621 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:42.051597 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/llm-d-routing-sidecar/0.log" Apr 16 22:42:42.063023 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:42.062996 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/storage-initializer/0.log" Apr 16 22:42:43.006426 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:43.006400 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6tjhq_f350b007-8b12-4561-aa7c-9c4fe292da30/istio-proxy/0.log" Apr 16 22:42:43.028359 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:43.028329 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:43.034729 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:43.034694 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/llm-d-routing-sidecar/0.log" Apr 16 22:42:43.045316 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:43.045296 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/storage-initializer/0.log" Apr 16 22:42:43.991431 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:43.991398 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6tjhq_f350b007-8b12-4561-aa7c-9c4fe292da30/istio-proxy/0.log" Apr 16 22:42:44.011966 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:44.011935 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:44.017782 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:44.017757 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/llm-d-routing-sidecar/0.log" Apr 16 22:42:44.027497 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:44.027470 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/storage-initializer/0.log" Apr 16 22:42:45.016772 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:45.016747 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5c7c6f9bb8-k7dmc_5e1bbae3-73bb-42df-9209-225120eedbb0/router/0.log" Apr 16 22:42:45.623124 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:45.623018 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="llm-d-routing-sidecar" containerID="cri-o://e350e6e7be43be9b4c727a7c03c277e124d41a829bcfa03353fff77788fa767f" gracePeriod=2 Apr 16 22:42:45.816868 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:45.816842 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:45.817520 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:45.817499 2572 generic.go:358] "Generic (PLEG): container finished" podID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerID="ca8c89f8283e426e4e4ec551b0172a8a81f1657bf515966c5a47c4f3084b3ceb" exitCode=137 Apr 16 22:42:45.817520 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:45.817519 2572 generic.go:358] 
"Generic (PLEG): container finished" podID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerID="e350e6e7be43be9b4c727a7c03c277e124d41a829bcfa03353fff77788fa767f" exitCode=0 Apr 16 22:42:45.817691 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:45.817570 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" event={"ID":"6428449d-012b-44d2-bd22-55a2c3f144a6","Type":"ContainerDied","Data":"ca8c89f8283e426e4e4ec551b0172a8a81f1657bf515966c5a47c4f3084b3ceb"} Apr 16 22:42:45.817691 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:45.817604 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" event={"ID":"6428449d-012b-44d2-bd22-55a2c3f144a6","Type":"ContainerDied","Data":"e350e6e7be43be9b4c727a7c03c277e124d41a829bcfa03353fff77788fa767f"} Apr 16 22:42:45.853795 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:45.853771 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5c7c6f9bb8-k7dmc_5e1bbae3-73bb-42df-9209-225120eedbb0/router/0.log" Apr 16 22:42:45.900556 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:45.900533 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:45.901275 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:45.901258 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:42:46.015553 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.015525 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-kserve-provision-location\") pod \"6428449d-012b-44d2-bd22-55a2c3f144a6\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " Apr 16 22:42:46.015739 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.015568 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-dshm\") pod \"6428449d-012b-44d2-bd22-55a2c3f144a6\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " Apr 16 22:42:46.015739 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.015642 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-home\") pod \"6428449d-012b-44d2-bd22-55a2c3f144a6\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " Apr 16 22:42:46.015739 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.015662 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6428449d-012b-44d2-bd22-55a2c3f144a6-tls-certs\") pod \"6428449d-012b-44d2-bd22-55a2c3f144a6\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " Apr 16 22:42:46.015739 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.015680 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkn6w\" (UniqueName: \"kubernetes.io/projected/6428449d-012b-44d2-bd22-55a2c3f144a6-kube-api-access-nkn6w\") pod \"6428449d-012b-44d2-bd22-55a2c3f144a6\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " Apr 16 22:42:46.015739 ip-10-0-133-16 
kubenswrapper[2572]: I0416 22:42:46.015705 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-model-cache\") pod \"6428449d-012b-44d2-bd22-55a2c3f144a6\" (UID: \"6428449d-012b-44d2-bd22-55a2c3f144a6\") " Apr 16 22:42:46.016007 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.015973 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-model-cache" (OuterVolumeSpecName: "model-cache") pod "6428449d-012b-44d2-bd22-55a2c3f144a6" (UID: "6428449d-012b-44d2-bd22-55a2c3f144a6"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:42:46.016064 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.016006 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-home" (OuterVolumeSpecName: "home") pod "6428449d-012b-44d2-bd22-55a2c3f144a6" (UID: "6428449d-012b-44d2-bd22-55a2c3f144a6"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:42:46.017801 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.017759 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6428449d-012b-44d2-bd22-55a2c3f144a6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6428449d-012b-44d2-bd22-55a2c3f144a6" (UID: "6428449d-012b-44d2-bd22-55a2c3f144a6"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:42:46.018265 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.018238 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6428449d-012b-44d2-bd22-55a2c3f144a6-kube-api-access-nkn6w" (OuterVolumeSpecName: "kube-api-access-nkn6w") pod "6428449d-012b-44d2-bd22-55a2c3f144a6" (UID: "6428449d-012b-44d2-bd22-55a2c3f144a6"). InnerVolumeSpecName "kube-api-access-nkn6w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:42:46.018265 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.018259 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-dshm" (OuterVolumeSpecName: "dshm") pod "6428449d-012b-44d2-bd22-55a2c3f144a6" (UID: "6428449d-012b-44d2-bd22-55a2c3f144a6"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:42:46.082665 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.082598 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6428449d-012b-44d2-bd22-55a2c3f144a6" (UID: "6428449d-012b-44d2-bd22-55a2c3f144a6"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:42:46.117036 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.116993 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-dshm\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:42:46.117036 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.117033 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-home\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:42:46.117334 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.117047 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6428449d-012b-44d2-bd22-55a2c3f144a6-tls-certs\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:42:46.117334 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.117067 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nkn6w\" (UniqueName: \"kubernetes.io/projected/6428449d-012b-44d2-bd22-55a2c3f144a6-kube-api-access-nkn6w\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:42:46.117334 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.117080 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-model-cache\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:42:46.117334 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.117094 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6428449d-012b-44d2-bd22-55a2c3f144a6-kserve-provision-location\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:42:46.660624 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.660593 2572 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-rzvrd_a2649eb6-0c6b-4152-8f3b-7cca119c92b2/kuadrant-console-plugin/0.log" Apr 16 22:42:46.698452 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.698414 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-kpr2g_891f6d15-dc70-46a5-b577-177289b5f30d/limitador/0.log" Apr 16 22:42:46.822729 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.822702 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv_6428449d-012b-44d2-bd22-55a2c3f144a6/main/0.log" Apr 16 22:42:46.823376 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.823346 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" event={"ID":"6428449d-012b-44d2-bd22-55a2c3f144a6","Type":"ContainerDied","Data":"d16d2d40e31dbaa1a73e66cf6706860246b19351942331c0f652efde68b87faf"} Apr 16 22:42:46.823500 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.823393 2572 scope.go:117] "RemoveContainer" containerID="ca8c89f8283e426e4e4ec551b0172a8a81f1657bf515966c5a47c4f3084b3ceb" Apr 16 22:42:46.823500 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.823435 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv" Apr 16 22:42:46.843491 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.843459 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv"] Apr 16 22:42:46.845944 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.845920 2572 scope.go:117] "RemoveContainer" containerID="a8906bccf45f02a0ac8d0eb4d0d8789b27d6492c847b37a14cb3d26eba4d4436" Apr 16 22:42:46.847456 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.847432 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7cb7ff74db-hx2tv"] Apr 16 22:42:46.917953 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:46.917926 2572 scope.go:117] "RemoveContainer" containerID="e350e6e7be43be9b4c727a7c03c277e124d41a829bcfa03353fff77788fa767f" Apr 16 22:42:47.828897 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:47.828864 2572 generic.go:358] "Generic (PLEG): container finished" podID="6f77a1b4-e06b-498e-b816-bdfbf22db2b8" containerID="c5a056ff1952cd474f953c38dfdc96c1f7e285553113b9ee8cfcf2e4ff12ac72" exitCode=0 Apr 16 22:42:47.829384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:47.828937 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jq6m8/must-gather-mdcxr" event={"ID":"6f77a1b4-e06b-498e-b816-bdfbf22db2b8","Type":"ContainerDied","Data":"c5a056ff1952cd474f953c38dfdc96c1f7e285553113b9ee8cfcf2e4ff12ac72"} Apr 16 22:42:47.829384 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:47.829358 2572 scope.go:117] "RemoveContainer" containerID="c5a056ff1952cd474f953c38dfdc96c1f7e285553113b9ee8cfcf2e4ff12ac72" Apr 16 22:42:48.374269 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:48.374237 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jq6m8_must-gather-mdcxr_6f77a1b4-e06b-498e-b816-bdfbf22db2b8/gather/0.log" Apr 16 22:42:48.444461 
ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:48.444428 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" path="/var/lib/kubelet/pods/6428449d-012b-44d2-bd22-55a2c3f144a6/volumes" Apr 16 22:42:51.753143 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:51.753093 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-ktlhf_a800d5ed-9725-4dd0-a59c-eab57f01a3bc/global-pull-secret-syncer/0.log" Apr 16 22:42:51.844548 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:51.844513 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rkbz7_5f94db8f-97a3-4835-b7e4-7ef02819127d/konnectivity-agent/0.log" Apr 16 22:42:51.864711 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:51.864684 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-16.ec2.internal_1f3ee50f97b2910d9326d3a396c451df/haproxy/0.log" Apr 16 22:42:53.857227 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:53.857194 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-jq6m8/must-gather-mdcxr"] Apr 16 22:42:53.857604 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:53.857398 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-jq6m8/must-gather-mdcxr" podUID="6f77a1b4-e06b-498e-b816-bdfbf22db2b8" containerName="copy" containerID="cri-o://d1ea739c287e576cf86589a9aec7dfbbff6047a3570b1cae4f9105b8e86a87bb" gracePeriod=2 Apr 16 22:42:53.863156 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:53.863133 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-jq6m8/must-gather-mdcxr"] Apr 16 22:42:54.091004 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.090981 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jq6m8_must-gather-mdcxr_6f77a1b4-e06b-498e-b816-bdfbf22db2b8/copy/0.log" 
Apr 16 22:42:54.091339 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.091323 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jq6m8/must-gather-mdcxr" Apr 16 22:42:54.093661 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.093636 2572 status_manager.go:895] "Failed to get status for pod" podUID="6f77a1b4-e06b-498e-b816-bdfbf22db2b8" pod="openshift-must-gather-jq6m8/must-gather-mdcxr" err="pods \"must-gather-mdcxr\" is forbidden: User \"system:node:ip-10-0-133-16.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-jq6m8\": no relationship found between node 'ip-10-0-133-16.ec2.internal' and this object" Apr 16 22:42:54.189929 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.189842 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29sbl\" (UniqueName: \"kubernetes.io/projected/6f77a1b4-e06b-498e-b816-bdfbf22db2b8-kube-api-access-29sbl\") pod \"6f77a1b4-e06b-498e-b816-bdfbf22db2b8\" (UID: \"6f77a1b4-e06b-498e-b816-bdfbf22db2b8\") " Apr 16 22:42:54.189929 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.189909 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f77a1b4-e06b-498e-b816-bdfbf22db2b8-must-gather-output\") pod \"6f77a1b4-e06b-498e-b816-bdfbf22db2b8\" (UID: \"6f77a1b4-e06b-498e-b816-bdfbf22db2b8\") " Apr 16 22:42:54.192091 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.192066 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f77a1b4-e06b-498e-b816-bdfbf22db2b8-kube-api-access-29sbl" (OuterVolumeSpecName: "kube-api-access-29sbl") pod "6f77a1b4-e06b-498e-b816-bdfbf22db2b8" (UID: "6f77a1b4-e06b-498e-b816-bdfbf22db2b8"). InnerVolumeSpecName "kube-api-access-29sbl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:42:54.195749 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.195727 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f77a1b4-e06b-498e-b816-bdfbf22db2b8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6f77a1b4-e06b-498e-b816-bdfbf22db2b8" (UID: "6f77a1b4-e06b-498e-b816-bdfbf22db2b8"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:42:54.291633 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.291601 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-29sbl\" (UniqueName: \"kubernetes.io/projected/6f77a1b4-e06b-498e-b816-bdfbf22db2b8-kube-api-access-29sbl\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:42:54.291633 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.291629 2572 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f77a1b4-e06b-498e-b816-bdfbf22db2b8-must-gather-output\") on node \"ip-10-0-133-16.ec2.internal\" DevicePath \"\"" Apr 16 22:42:54.445296 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.445217 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f77a1b4-e06b-498e-b816-bdfbf22db2b8" path="/var/lib/kubelet/pods/6f77a1b4-e06b-498e-b816-bdfbf22db2b8/volumes" Apr 16 22:42:54.853146 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.853101 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-jq6m8_must-gather-mdcxr_6f77a1b4-e06b-498e-b816-bdfbf22db2b8/copy/0.log" Apr 16 22:42:54.853448 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.853419 2572 generic.go:358] "Generic (PLEG): container finished" podID="6f77a1b4-e06b-498e-b816-bdfbf22db2b8" containerID="d1ea739c287e576cf86589a9aec7dfbbff6047a3570b1cae4f9105b8e86a87bb" exitCode=143 Apr 16 22:42:54.853563 ip-10-0-133-16 
kubenswrapper[2572]: I0416 22:42:54.853469 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jq6m8/must-gather-mdcxr" Apr 16 22:42:54.853563 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.853473 2572 scope.go:117] "RemoveContainer" containerID="d1ea739c287e576cf86589a9aec7dfbbff6047a3570b1cae4f9105b8e86a87bb" Apr 16 22:42:54.861332 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.861199 2572 scope.go:117] "RemoveContainer" containerID="c5a056ff1952cd474f953c38dfdc96c1f7e285553113b9ee8cfcf2e4ff12ac72" Apr 16 22:42:54.874745 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.874729 2572 scope.go:117] "RemoveContainer" containerID="d1ea739c287e576cf86589a9aec7dfbbff6047a3570b1cae4f9105b8e86a87bb" Apr 16 22:42:54.874989 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:42:54.874970 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1ea739c287e576cf86589a9aec7dfbbff6047a3570b1cae4f9105b8e86a87bb\": container with ID starting with d1ea739c287e576cf86589a9aec7dfbbff6047a3570b1cae4f9105b8e86a87bb not found: ID does not exist" containerID="d1ea739c287e576cf86589a9aec7dfbbff6047a3570b1cae4f9105b8e86a87bb" Apr 16 22:42:54.875055 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.875005 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ea739c287e576cf86589a9aec7dfbbff6047a3570b1cae4f9105b8e86a87bb"} err="failed to get container status \"d1ea739c287e576cf86589a9aec7dfbbff6047a3570b1cae4f9105b8e86a87bb\": rpc error: code = NotFound desc = could not find container \"d1ea739c287e576cf86589a9aec7dfbbff6047a3570b1cae4f9105b8e86a87bb\": container with ID starting with d1ea739c287e576cf86589a9aec7dfbbff6047a3570b1cae4f9105b8e86a87bb not found: ID does not exist" Apr 16 22:42:54.875055 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.875026 2572 scope.go:117] "RemoveContainer" 
containerID="c5a056ff1952cd474f953c38dfdc96c1f7e285553113b9ee8cfcf2e4ff12ac72" Apr 16 22:42:54.875297 ip-10-0-133-16 kubenswrapper[2572]: E0416 22:42:54.875271 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5a056ff1952cd474f953c38dfdc96c1f7e285553113b9ee8cfcf2e4ff12ac72\": container with ID starting with c5a056ff1952cd474f953c38dfdc96c1f7e285553113b9ee8cfcf2e4ff12ac72 not found: ID does not exist" containerID="c5a056ff1952cd474f953c38dfdc96c1f7e285553113b9ee8cfcf2e4ff12ac72" Apr 16 22:42:54.875342 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:54.875308 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5a056ff1952cd474f953c38dfdc96c1f7e285553113b9ee8cfcf2e4ff12ac72"} err="failed to get container status \"c5a056ff1952cd474f953c38dfdc96c1f7e285553113b9ee8cfcf2e4ff12ac72\": rpc error: code = NotFound desc = could not find container \"c5a056ff1952cd474f953c38dfdc96c1f7e285553113b9ee8cfcf2e4ff12ac72\": container with ID starting with c5a056ff1952cd474f953c38dfdc96c1f7e285553113b9ee8cfcf2e4ff12ac72 not found: ID does not exist" Apr 16 22:42:55.779832 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:55.779805 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-rzvrd_a2649eb6-0c6b-4152-8f3b-7cca119c92b2/kuadrant-console-plugin/0.log" Apr 16 22:42:55.855492 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:55.855458 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-kpr2g_891f6d15-dc70-46a5-b577-177289b5f30d/limitador/0.log" Apr 16 22:42:57.054795 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:57.054761 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-bjs2g_ee7387b2-e00d-45fd-881b-86e9cd70ab5b/cluster-monitoring-operator/0.log" Apr 16 
22:42:57.148913 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:57.148883 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-67b66ff5c-b8v8q_239bdbcf-4bb5-4a84-b8ff-6bf0e532f445/metrics-server/0.log" Apr 16 22:42:57.278547 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:57.278523 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xlf7l_3b2bdd50-6512-49b4-bf64-f9c22a868568/node-exporter/0.log" Apr 16 22:42:57.299302 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:57.299280 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xlf7l_3b2bdd50-6512-49b4-bf64-f9c22a868568/kube-rbac-proxy/0.log" Apr 16 22:42:57.322187 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:57.322096 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xlf7l_3b2bdd50-6512-49b4-bf64-f9c22a868568/init-textfile/0.log" Apr 16 22:42:57.439468 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:57.439429 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-w9dhd_8ff4f32e-7232-4cba-8429-a746bb63a2a3/kube-rbac-proxy-main/0.log" Apr 16 22:42:57.460521 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:57.460493 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-w9dhd_8ff4f32e-7232-4cba-8429-a746bb63a2a3/kube-rbac-proxy-self/0.log" Apr 16 22:42:57.481894 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:57.481872 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-w9dhd_8ff4f32e-7232-4cba-8429-a746bb63a2a3/openshift-state-metrics/0.log" Apr 16 22:42:57.875500 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:57.875471 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-5fbd9767d-rjqql_dd94387b-2cfc-4696-8aa0-7cfe19572cf7/thanos-query/0.log" Apr 16 22:42:57.895896 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:57.895865 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fbd9767d-rjqql_dd94387b-2cfc-4696-8aa0-7cfe19572cf7/kube-rbac-proxy-web/0.log" Apr 16 22:42:57.918481 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:57.918455 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fbd9767d-rjqql_dd94387b-2cfc-4696-8aa0-7cfe19572cf7/kube-rbac-proxy/0.log" Apr 16 22:42:57.943269 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:57.943248 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fbd9767d-rjqql_dd94387b-2cfc-4696-8aa0-7cfe19572cf7/prom-label-proxy/0.log" Apr 16 22:42:57.964340 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:57.964304 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fbd9767d-rjqql_dd94387b-2cfc-4696-8aa0-7cfe19572cf7/kube-rbac-proxy-rules/0.log" Apr 16 22:42:57.994274 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:57.994253 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fbd9767d-rjqql_dd94387b-2cfc-4696-8aa0-7cfe19572cf7/kube-rbac-proxy-metrics/0.log" Apr 16 22:42:59.329967 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:42:59.329940 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-r575f_312dcba8-3e17-48d7-9fb2-06758dcd295e/networking-console-plugin/0.log" Apr 16 22:43:00.812390 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.812357 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"] Apr 16 22:43:00.812772 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.812695 
2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="llm-d-routing-sidecar"
Apr 16 22:43:00.812772 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.812706 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="llm-d-routing-sidecar"
Apr 16 22:43:00.812772 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.812720 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f77a1b4-e06b-498e-b816-bdfbf22db2b8" containerName="copy"
Apr 16 22:43:00.812772 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.812725 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f77a1b4-e06b-498e-b816-bdfbf22db2b8" containerName="copy"
Apr 16 22:43:00.812772 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.812740 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f77a1b4-e06b-498e-b816-bdfbf22db2b8" containerName="gather"
Apr 16 22:43:00.812772 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.812745 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f77a1b4-e06b-498e-b816-bdfbf22db2b8" containerName="gather"
Apr 16 22:43:00.812772 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.812757 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="main"
Apr 16 22:43:00.812772 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.812762 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="main"
Apr 16 22:43:00.812772 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.812774 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="storage-initializer"
Apr 16 22:43:00.813079 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.812780 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="storage-initializer"
Apr 16 22:43:00.813079 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.812830 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="main"
Apr 16 22:43:00.813079 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.812836 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f77a1b4-e06b-498e-b816-bdfbf22db2b8" containerName="gather"
Apr 16 22:43:00.813079 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.812844 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6428449d-012b-44d2-bd22-55a2c3f144a6" containerName="llm-d-routing-sidecar"
Apr 16 22:43:00.813079 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.812851 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f77a1b4-e06b-498e-b816-bdfbf22db2b8" containerName="copy"
Apr 16 22:43:00.815871 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.815852 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:00.818465 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.818444 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6fvlv\"/\"openshift-service-ca.crt\""
Apr 16 22:43:00.818608 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.818476 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6fvlv\"/\"default-dockercfg-t4h5p\""
Apr 16 22:43:00.819645 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.819627 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6fvlv\"/\"kube-root-ca.crt\""
Apr 16 22:43:00.823832 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.823812 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"]
Apr 16 22:43:00.847284 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.847261 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2f0e9f33-7cce-41ec-b51d-e278e009e4ef-podres\") pod \"perf-node-gather-daemonset-mvfkg\" (UID: \"2f0e9f33-7cce-41ec-b51d-e278e009e4ef\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:00.847418 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.847290 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2f0e9f33-7cce-41ec-b51d-e278e009e4ef-proc\") pod \"perf-node-gather-daemonset-mvfkg\" (UID: \"2f0e9f33-7cce-41ec-b51d-e278e009e4ef\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:00.847418 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.847318 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2f0e9f33-7cce-41ec-b51d-e278e009e4ef-lib-modules\") pod \"perf-node-gather-daemonset-mvfkg\" (UID: \"2f0e9f33-7cce-41ec-b51d-e278e009e4ef\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:00.847418 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.847375 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jddd\" (UniqueName: \"kubernetes.io/projected/2f0e9f33-7cce-41ec-b51d-e278e009e4ef-kube-api-access-6jddd\") pod \"perf-node-gather-daemonset-mvfkg\" (UID: \"2f0e9f33-7cce-41ec-b51d-e278e009e4ef\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:00.847525 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.847461 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2f0e9f33-7cce-41ec-b51d-e278e009e4ef-sys\") pod \"perf-node-gather-daemonset-mvfkg\" (UID: \"2f0e9f33-7cce-41ec-b51d-e278e009e4ef\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:00.885658 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.885633 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-ghzq7_8c11f0d2-8369-4d94-8465-bfa195f8781d/volume-data-source-validator/0.log"
Apr 16 22:43:00.948753 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.948720 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2f0e9f33-7cce-41ec-b51d-e278e009e4ef-podres\") pod \"perf-node-gather-daemonset-mvfkg\" (UID: \"2f0e9f33-7cce-41ec-b51d-e278e009e4ef\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:00.948753 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.948754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2f0e9f33-7cce-41ec-b51d-e278e009e4ef-proc\") pod \"perf-node-gather-daemonset-mvfkg\" (UID: \"2f0e9f33-7cce-41ec-b51d-e278e009e4ef\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:00.948931 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.948782 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2f0e9f33-7cce-41ec-b51d-e278e009e4ef-lib-modules\") pod \"perf-node-gather-daemonset-mvfkg\" (UID: \"2f0e9f33-7cce-41ec-b51d-e278e009e4ef\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:00.948931 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.948858 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2f0e9f33-7cce-41ec-b51d-e278e009e4ef-proc\") pod \"perf-node-gather-daemonset-mvfkg\" (UID: \"2f0e9f33-7cce-41ec-b51d-e278e009e4ef\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:00.948931 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.948893 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2f0e9f33-7cce-41ec-b51d-e278e009e4ef-lib-modules\") pod \"perf-node-gather-daemonset-mvfkg\" (UID: \"2f0e9f33-7cce-41ec-b51d-e278e009e4ef\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:00.948931 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.948904 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jddd\" (UniqueName: \"kubernetes.io/projected/2f0e9f33-7cce-41ec-b51d-e278e009e4ef-kube-api-access-6jddd\") pod \"perf-node-gather-daemonset-mvfkg\" (UID: \"2f0e9f33-7cce-41ec-b51d-e278e009e4ef\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:00.949147 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.948960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2f0e9f33-7cce-41ec-b51d-e278e009e4ef-sys\") pod \"perf-node-gather-daemonset-mvfkg\" (UID: \"2f0e9f33-7cce-41ec-b51d-e278e009e4ef\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:00.949147 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.948893 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2f0e9f33-7cce-41ec-b51d-e278e009e4ef-podres\") pod \"perf-node-gather-daemonset-mvfkg\" (UID: \"2f0e9f33-7cce-41ec-b51d-e278e009e4ef\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:00.949147 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.949014 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2f0e9f33-7cce-41ec-b51d-e278e009e4ef-sys\") pod \"perf-node-gather-daemonset-mvfkg\" (UID: \"2f0e9f33-7cce-41ec-b51d-e278e009e4ef\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:00.957269 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:00.957248 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jddd\" (UniqueName: \"kubernetes.io/projected/2f0e9f33-7cce-41ec-b51d-e278e009e4ef-kube-api-access-6jddd\") pod \"perf-node-gather-daemonset-mvfkg\" (UID: \"2f0e9f33-7cce-41ec-b51d-e278e009e4ef\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:01.126340 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:01.126247 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:01.246994 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:01.246964 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"]
Apr 16 22:43:01.250485 ip-10-0-133-16 kubenswrapper[2572]: W0416 22:43:01.250455 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2f0e9f33_7cce_41ec_b51d_e278e009e4ef.slice/crio-7e0b323adcc7b38f20b991f5ed7113269094ac9d8b1454c6dcdcc40c4c76f8ec WatchSource:0}: Error finding container 7e0b323adcc7b38f20b991f5ed7113269094ac9d8b1454c6dcdcc40c4c76f8ec: Status 404 returned error can't find the container with id 7e0b323adcc7b38f20b991f5ed7113269094ac9d8b1454c6dcdcc40c4c76f8ec
Apr 16 22:43:01.597313 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:01.597281 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mf4m2_2357d8ff-11e8-4af7-a6de-7223d6294ade/dns/0.log"
Apr 16 22:43:01.616412 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:01.616358 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mf4m2_2357d8ff-11e8-4af7-a6de-7223d6294ade/kube-rbac-proxy/0.log"
Apr 16 22:43:01.767177 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:01.767148 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wr7c8_07fc2833-ed6f-40da-ac15-b1ce9aa369c6/dns-node-resolver/0.log"
Apr 16 22:43:01.879206 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:01.879126 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg" event={"ID":"2f0e9f33-7cce-41ec-b51d-e278e009e4ef","Type":"ContainerStarted","Data":"4af45f9bfc8b5ec47b4ab2e72f4d55a1aafc260bf487417b482ceee0a8f9b753"}
Apr 16 22:43:01.879206 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:01.879160 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg" event={"ID":"2f0e9f33-7cce-41ec-b51d-e278e009e4ef","Type":"ContainerStarted","Data":"7e0b323adcc7b38f20b991f5ed7113269094ac9d8b1454c6dcdcc40c4c76f8ec"}
Apr 16 22:43:01.879206 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:01.879184 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:01.896341 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:01.896297 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg" podStartSLOduration=1.896283734 podStartE2EDuration="1.896283734s" podCreationTimestamp="2026-04-16 22:43:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:43:01.894785901 +0000 UTC m=+1762.119952306" watchObservedRunningTime="2026-04-16 22:43:01.896283734 +0000 UTC m=+1762.121450174"
Apr 16 22:43:02.232322 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:02.232256 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-69fb54c674-n5bgr_5546cbc6-033e-4879-90dc-24493ae81f46/registry/0.log"
Apr 16 22:43:02.238580 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:02.238556 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-69fb54c674-n5bgr_5546cbc6-033e-4879-90dc-24493ae81f46/registry/1.log"
Apr 16 22:43:02.281459 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:02.281434 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7969m_57bb8201-a601-40d9-9277-43dfb8dd33cd/node-ca/0.log"
Apr 16 22:43:03.171142 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:03.171075 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5c7c6f9bb8-k7dmc_5e1bbae3-73bb-42df-9209-225120eedbb0/router/0.log"
Apr 16 22:43:03.670709 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:03.670674 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-tbnzs_fd83defb-549f-4b35-96b4-9fe5c34c7ebf/serve-healthcheck-canary/0.log"
Apr 16 22:43:04.224044 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:04.224012 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-2qjf4_8642ce80-b932-4245-8f5a-3ce2e6014659/insights-operator/0.log"
Apr 16 22:43:04.224580 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:04.224330 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-2qjf4_8642ce80-b932-4245-8f5a-3ce2e6014659/insights-operator/1.log"
Apr 16 22:43:04.313014 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:04.312989 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-th4nh_85c31b35-d034-451f-b044-035fb08e99c1/kube-rbac-proxy/0.log"
Apr 16 22:43:04.334281 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:04.334256 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-th4nh_85c31b35-d034-451f-b044-035fb08e99c1/exporter/0.log"
Apr 16 22:43:04.356468 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:04.356449 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-th4nh_85c31b35-d034-451f-b044-035fb08e99c1/extractor/0.log"
Apr 16 22:43:06.932015 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:06.931989 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5cdcb589b5-787l4_6f753f1e-c7b2-4ab6-8588-6a777fcfb68b/manager/0.log"
Apr 16 22:43:07.801346 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:07.801321 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-x2gnk_5ccfdb52-5fff-4aed-9172-fb9f35a8dd34/s3-init/0.log"
Apr 16 22:43:07.891971 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:07.891946 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-mvfkg"
Apr 16 22:43:12.408511 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:12.408471 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-mgtzp_13cc71b0-ff0e-4ee6-9a87-fef5891bc28a/migrator/0.log"
Apr 16 22:43:12.428267 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:12.428241 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-mgtzp_13cc71b0-ff0e-4ee6-9a87-fef5891bc28a/graceful-termination/0.log"
Apr 16 22:43:12.770250 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:12.770144 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-sblhq_477be877-0af9-4fc8-85b1-24658b74a7a8/kube-storage-version-migrator-operator/1.log"
Apr 16 22:43:12.771054 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:12.771031 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-sblhq_477be877-0af9-4fc8-85b1-24658b74a7a8/kube-storage-version-migrator-operator/0.log"
Apr 16 22:43:14.019297 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:14.019268 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m9jnj_5ca5fa21-17cb-4a9f-867d-d2d0533d72c3/kube-multus-additional-cni-plugins/0.log"
Apr 16 22:43:14.041346 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:14.041322 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m9jnj_5ca5fa21-17cb-4a9f-867d-d2d0533d72c3/egress-router-binary-copy/0.log"
Apr 16 22:43:14.061982 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:14.061957 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m9jnj_5ca5fa21-17cb-4a9f-867d-d2d0533d72c3/cni-plugins/0.log"
Apr 16 22:43:14.082253 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:14.082232 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m9jnj_5ca5fa21-17cb-4a9f-867d-d2d0533d72c3/bond-cni-plugin/0.log"
Apr 16 22:43:14.102681 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:14.102663 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m9jnj_5ca5fa21-17cb-4a9f-867d-d2d0533d72c3/routeoverride-cni/0.log"
Apr 16 22:43:14.123804 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:14.123779 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m9jnj_5ca5fa21-17cb-4a9f-867d-d2d0533d72c3/whereabouts-cni-bincopy/0.log"
Apr 16 22:43:14.144191 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:14.144162 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m9jnj_5ca5fa21-17cb-4a9f-867d-d2d0533d72c3/whereabouts-cni/0.log"
Apr 16 22:43:14.173443 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:14.173409 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pkxvv_e35d7972-4269-49d3-ab9e-fcb9b7abc90d/kube-multus/0.log"
Apr 16 22:43:14.268949 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:14.268923 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bvqqh_bb0e10b7-e200-4438-8847-608a1a6cace3/network-metrics-daemon/0.log"
Apr 16 22:43:14.287095 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:14.287035 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bvqqh_bb0e10b7-e200-4438-8847-608a1a6cace3/kube-rbac-proxy/0.log"
Apr 16 22:43:15.118125 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:15.118070 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-controller/0.log"
Apr 16 22:43:15.135784 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:15.135758 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/0.log"
Apr 16 22:43:15.143160 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:15.143126 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovn-acl-logging/1.log"
Apr 16 22:43:15.160056 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:15.160007 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/kube-rbac-proxy-node/0.log"
Apr 16 22:43:15.180613 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:15.180587 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 22:43:15.199882 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:15.199860 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/northd/0.log"
Apr 16 22:43:15.220095 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:15.220053 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/nbdb/0.log"
Apr 16 22:43:15.244443 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:15.244405 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/sbdb/0.log"
Apr 16 22:43:15.341420 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:15.341390 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6bqmn_a3bd5cbc-ecad-44b9-9c14-ff88792450fa/ovnkube-controller/0.log"
Apr 16 22:43:16.951279 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:16.951251 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-8m6vz_2bfe578c-4062-4475-9e81-7dcc34aef95a/check-endpoints/0.log"
Apr 16 22:43:17.023590 ip-10-0-133-16 kubenswrapper[2572]: I0416 22:43:17.023568 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zh7bj_6ea887bd-cdbc-451b-9d3d-df42e2023e31/network-check-target-container/0.log"